Lenovo ThinkPad Carbon X1 (20KH) 6th Gen Linux (Ubuntu 18.04)

I have received a new Lenovo ThinkPad Carbon X1 (6th generation; mine is the 20KH model with NFC and the higher-resolution screen), which I have been trying to get Kubuntu 18.04 LTS running on. This has been largely successful, with a few problematic areas, mostly the trackpad. I am a fan of mouse button emulation, which lets you tap with one finger for a left click, two fingers for a right click, and three for a middle click. This is the one thing I haven’t been able to get working reliably.

The Trackpad

The problems with the trackpad are that it is slow to start working on boot, and often after waking from sleep. Mouse button emulation also fails intermittently, accompanied by CPU usage spikes that I think are some sort of driver crash, stuck state, or similar.

Edit: This is probably the fix we are looking for! Install xserver-xorg-input-synaptics (which for me is xserver-xorg-input-synaptics-hwe-18.04), and leave ‘i2c_i801’ commented out of ‘/etc/modprobe.d/blacklist.conf’ so the module loads. This seems to have done it: I now have a trackpad that works on boot and after waking from sleep, and I have the config controls in the system settings, so I think this is the way forward. In addition, as the physical buttons did occasionally stop working, I suggest getting pm-utils to run a couple of reconnect commands at wake-up; the bash script is shown just after the install command below.
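For reference, the package install is a one-liner (on 18.04 the hwe variant gets pulled in for you):

sudo apt install xserver-xorg-input-synaptics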

#!/bin/bash
#reconnects the trackpad after sleep (pm-utils hooks run as root, so no sudo needed)

. "${PM_FUNCTIONS}"

case "$1" in
    resume|thaw)
        echo -n "none" > /sys/bus/serio/devices/serio1/drvctl
        echo -n "reconnect" > /sys/bus/serio/devices/serio1/drvctl
        ;;
    *) exit $NA
       ;;
esac

exit 0

The best place for the script on Ubuntu 18.04 LTS is a file in /usr/lib/pm-utils/sleep.d/, which I called 95-trackpad.
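Remember that the hook has to be executable or pm-utils will skip it; something like this does it:

sudo cp 95-trackpad /usr/lib/pm-utils/sleep.d/
sudo chmod +x /usr/lib/pm-utils/sleep.d/95-trackpad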

Possible fix one: getting the trackpad working might require that ‘i2c_i801’ is commented out of ‘/etc/modprobe.d/blacklist.conf’. This fixes the problems with the trackpad being detected at boot and when coming out of sleep.

However… possible fix two: you might have more success with leaving ‘i2c_i801’ on the blacklist and adding ‘psmouse.synaptics_intertouch=1’ to your GRUB config (‘/etc/default/grub’), then running ‘sudo update-grub’. This also seems to work.
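If you go the GRUB route, the edit is small. Assuming the stock ‘quiet splash’ default, the line in ‘/etc/default/grub’ ends up looking like this:

GRUB_CMDLINE_LINUX_DEFAULT="quiet splash psmouse.synaptics_intertouch=1"

Then run ‘sudo update-grub’ and reboot.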

This is the trouble with intermittent faults: getting to the bottom of the problem, and the solution, is very difficult. So I would suggest you try both and see which works better.

Neither, however, fixes mouse button emulation intermittently failing. I have not found a fix for that, and instead switched it off altogether, which leaves you using the physical mouse buttons.

Sleeping

To get sleep working correctly you need to make sure that, in the BIOS, the sleep mode is set to Linux instead of Windows. That is it. Without that, the system will not go in and out of sleep properly.

Other than these issues, and the WWAN modem not working (no Linux driver), the laptop seems to work well.


Moving Windows 10 install to an SSD

My fairly old Windows 10 machine was running very slowly, which was extremely annoying, and I didn’t want to invest in a new machine. So moving the Windows 10 install to an SSD seemed like an option worth trying.

I didn’t want to get a new machine for two reasons really: I didn’t want to have to move all the data, apps, etc. to a new computer, and I don’t use Windows much (so motivation was lacking). Investing in a new machine was not a high priority either. I looked at the performance monitoring tools in Windows to try to figure out what the problem was. The machine at the time had 16GB of DDR2 RAM and two quad-core AMD Opteron 2386 SE processors (2.8GHz), which, although not super high spec (anymore), is not bad.

What was the problem?

Looking at the performance monitoring, it became clear that the culprit was the old system disk, a 320GB hard drive, which was always at 100% utilisation and being hammered all the time. So I decided it was worth a shot to replace it with an SSD, to see if that would improve things enough to make the machine usable. If not, I could use the SSD in a new system anyway.

The SSD Upgrade

I bought a 250GB SSD, which was plenty big enough and cost about £30. The next question was the best way to move the system drive over to the new disk, in such a way that nothing would then need to be re-installed. On Linux this is fairly easy, as there is a load of free software that can do it; I wasn’t so sure about Windows. However, after a quick search I found EaseUS Todo Backup Free 11.5, which worked very well. The tool has a disk or partition clone mode, which is the bit you want. Here you have to make sure that you select the whole system drive, as there are a bunch of recovery partitions etc. So in the picture below I would choose Hard Disk 0, which selects all the partitions.

Hard disk cloning tool, with the source drive selected.

On the next screen you select the target drive. In my screenshot the target cannot be picked, as it’s not shown there, but this is where you would select the SSD. For an SSD there is an additional step: click ‘Advanced options’ and select ‘Optimise for SSD’.

Select the target drive; if it is an SSD, click Advanced Options and select optimise for SSD.

Once you have picked the target, click next; it is then a simple process to start cloning the system drive onto the new SSD. This will take a little while, and you can set the machine to shut itself down at the end.

You then just swap the drives over, and you might need to tell your computer to boot off the new drive in the BIOS, depending on whether you have other drives in the computer. All being well, the computer will boot up as normal, only, with a bit of luck, a lot quicker.

What was the outcome?

When I swapped the drive over I also put in different memory: I had a lot of memory spare from a server, so I put 32GB of RAM in the machine at the same time. So the upgrade was the SSD plus 32GB of RAM, and the difference is huge! It was well worth it; the machine is very much usable again. It boots in under a minute and works well for what I use Windows for. This is a great option for putting some life back into an old machine. The CPUs and GPU in this computer are old, but it will do pretty much anything most people need.

You might have problems with updates after this, saying that the System Reserved partition cannot be updated. If you do, check out this guide from Microsoft about how to fix it.


TensorFlow GPU Kubuntu LTS

I wanted to install TensorFlow with GPU support on my Kubuntu system (currently 16.04 LTS, but I believe this also works on 18.04 LTS). I am interested in trying TensorFlow with both Java and Python; I have done some testing with Python so far and will try Java later.

I first updated my system to ensure that all the packages were up to date, and used the driver utility to install the NVidia driver, version 396.54. TensorFlow uses CUDA 9.0 at the minute, so you have to install that version; you can get it here. I used the local runfile, as it is an easy way of getting CUDA installed in /usr/local and the samples installed in your home directory. If, like me, you have already installed the NVidia drivers, skip the step that installs the drivers, as it’s not required.

Download screen.

I chose the local runfile.

Once you have downloaded it, run it, remembering to skip the driver install if you already have the drivers installed.

sudo ~/Downloads/cuda_9.0.176_384.81_linux-run

TensorFlow also needs NVidia cuDNN to work. You have to join as an NVidia developer, and then you can download it; pick the version for CUDA 9.0.

cuDNN download

Choose the version for CUDA 9.0.

This can easily be installed in the terminal.

#unpack the cuDNN archive and copy it into the CUDA 9.0 install
tar xvzf cudnn-9.0-linux-x64-v7.2.1.38.tgz
sudo cp -P cuda/include/cudnn.h /usr/local/cuda-9.0/include
sudo cp -P cuda/lib64/libcudnn* /usr/local/cuda-9.0/lib64
sudo chmod a+r /usr/local/cuda-9.0/include/cudnn.h /usr/local/cuda-9.0/lib64/libcudnn*

Make sure that your .bash_profile file in your home dir contains something like the following (I am not sure the CUPTI entry is needed; it seems to be in there from ages ago):

export PATH=/usr/local/cuda/bin:$PATH
export LD_LIBRARY_PATH=/usr/local/cuda/lib64:/usr/local/cuda/extras/CUPTI/lib64:$LD_LIBRARY_PATH
export CUDA_HOME=/usr/local/cuda

Note that I have a symlink from /usr/local/cuda to /usr/local/cuda-9.0/. That is CUDA installed; you can check it is working by compiling one of the samples that use it.
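For example, deviceQuery builds quickly and confirms the toolkit and driver can both see the card. Assuming the samples went to the runfile’s default location in your home directory:

cd ~/NVIDIA_CUDA-9.0_Samples/1_Utilities/deviceQuery
make
./deviceQuery

It should list the GPU and end with ‘Result = PASS’.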

Next install TensorFlow. I did this in a Python virtual environment; the full details are here. As I have a Tesla K20 in this system I installed the version with GPU support. I use Python 3.5.

#install pip and virtualenv
$ sudo apt-get install python3-pip python3-dev python-virtualenv

#upgrade pip
$ sudo pip install -U pip

#make a dir to work out of
$ mkdir ~/tensorflow
$ cd ~/tensorflow

#create a Python 3 environment in environment_dir
$ virtualenv --system-site-packages -p python3 environment_dir

#activate the environment
$ source ~/tensorflow/environment_dir/bin/activate

#in the virtual environment upgrade pip
(environment_dir)$ pip install -U pip

#then install tensorflow with gpu support
(environment_dir)$ pip install -U tensorflow-gpu

#check it works
(environment_dir)$ python -c "import tensorflow as tf; print(tf.__version__)"

That should be it!
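To confirm TensorFlow can actually see the GPU, and not just import, you can also run this check (tf.test.is_gpu_available() is part of the TensorFlow 1.x API):

#check the GPU is visible
(environment_dir)$ python -c "import tensorflow as tf; print(tf.test.is_gpu_available())"

It should print True, after some log lines naming the card.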


Akroma Mining with Tesla K40

Following on from my post about mining ZCash, I decided to mine some Akroma coin. I like the look of the project as it has some nice features, and who knows, maybe it will take off! I don’t have a lot of mining hardware, but I have a test system for messing about with this stuff. I do mining more out of interest, and I buy some coins as investments, which are doing OK.

I downloaded the miner from here; it is a version that has the dev fee, but I am fine with that, as making software is hard work. You need to get a wallet; I chose the web wallet, as that seems like a reasonable option at the minute. In general I prefer stand-alone wallets, but this is fine until there is a better option for Akroma.

I am running the miner in ETH-only mode on Ubuntu Linux using the following command:


./ethdcrminer64 -epool stratum+tcp://geo.pool.akroma.eu:8001 -ewal YOURWALLETADDRESS -eworker workerx -epsw x -asm 1 -allpools 1 -di 1

On the K40 I get about 11.6 Mh/s, which is slightly less than I get with a K20 (12.14 Mh/s). I have no idea why that might be; perhaps differences in the CUDA install, as I think the system with the K20 is more up to date. If I leave both systems running I get about 7–8 coins in 24 hours; I will probably only run them together for a day or so and then leave one going, for about 4 coins a day.

I will leave this mining for a while, generate some coins, and then perhaps go back to ZCash, or try something new! I would like to build a small rig; I have an old system that could take about 3 GPUs. That might be the place to start, though it would be cool to get one of those multi-GPU cases that can hold about 4 GPUs.


OS X High Sierra Allow Apps in Security & Privacy

I use Mac laptops; up till now they have been close enough to Linux, and usable enough, that I like them. However, they are increasingly starting to annoy me, and my next laptop might be Linux only. One example of how annoying they are is this new ‘feature’ in OS X High Sierra: if you are installing something like a system extension, and you go to Security & Privacy to ‘allow’ it, pressing the ‘Allow’ button doesn’t work.

In my case I was enabling a file system feature of an app, and in order to do this I had to allow its extension to install by going to Security & Privacy. So I go there, unlock the settings, and press ‘Allow’. Nothing! Nothing happens; the button remains and the system extension isn’t allowed. So annoying. This is my computer! Have I no control!?

There is a ridiculous fix, and that is to use an AppleScript to ‘click’ in the correct place. The command is below; to get the coordinates you need, press cmd-shift-4, and the screenshot crosshairs will give you the coordinates that go in the {}.

osascript -e 'tell application "System Events" to click at {584, 819}'


Twitter4J Extended Mode – 280 Character Tweets

Looking to use Twitter4J to download the full 280 characters of extended tweets? It can easily be done. You need a relatively recent version to enable extended mode; I have tested this with version 4.0.6 and it works well.

It’s pretty straightforward: you enable it at the configuration-builder phase.

new twitter4j.conf.ConfigurationBuilder()
      .setOAuthConsumerKey("")
      .setOAuthConsumerSecret("")
      .setOAuthAccessToken("")
      .setOAuthAccessTokenSecret("")
      .setTweetModeExtended(true);

That’s it; the setTweetModeExtended(true) is all you need.
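For context, a minimal end-to-end sketch looks something like this (the builder chaining and factory call are standard Twitter4J; the tweet ID is just a placeholder):

import twitter4j.Status;
import twitter4j.Twitter;
import twitter4j.TwitterFactory;
import twitter4j.conf.ConfigurationBuilder;

public class ExtendedTweetDemo {
    public static void main(String[] args) throws Exception {
        ConfigurationBuilder cb = new ConfigurationBuilder()
                .setOAuthConsumerKey("")
                .setOAuthConsumerSecret("")
                .setOAuthAccessToken("")
                .setOAuthAccessTokenSecret("")
                .setTweetModeExtended(true);
        Twitter twitter = new TwitterFactory(cb.build()).getInstance();

        // with extended mode on, getText() returns the full text, up to 280 characters
        Status status = twitter.showStatus(123456789L); // placeholder tweet ID
        System.out.println(status.getText());
    }
}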


Graph Visualisation with Neo4J and Alchemy

I have been using the Alchemy.js library to do some graph visualisation for networks. It’s pretty good; my default template is below, as I had some trouble with those I found around the internet.

It makes a simple graph with node labels visible, and takes its data as a .json file. I produce the data file with an R script that extracts the data from a Neo4J database; the basic code for that is below. One thing to remember is that the data file containing the network is easily accessible to anyone who might want to download it. So if it constitutes a lot of work and you don’t want anyone to get hold of it (e.g. if it is research you intend to publish at a later point), this isn’t the method for you.

<html>
<head>
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/alchemyjs/0.4.2/alchemy.min.css" />
</head>
<body>
    <h1>Network Visualisation</h1>
  
    <div class="alchemy" id="alchemy"></div>

    <script src="http://cdn.graphalchemist.com/alchemy.min.js"></script>

    <script src="https://d3js.org/d3.v3.min.js" charset="utf-8"></script>
    <script src="https://cdnjs.cloudflare.com/ajax/libs/alchemyjs/0.4.2/alchemy.min.js"></script>
    <script src="https://cdnjs.cloudflare.com/ajax/libs/alchemyjs/0.4.2/scripts/vendor.js"></script>
  
    <script type="text/javascript">
         var config = {
              dataSource: 'data.json',
              forceLocked: false,
              graphHeight: function(){ return 1000; },
              graphWidth: function(){ return 1000; },      
              linkDistance: function(){ return 40; },
              nodeCaptionsOnByDefault: true,
              "nodeCaption": 'name', 
              "nodeMouseOver": 'name'
            
          };

          var alchemy = new Alchemy(config);
     </script>
</body>
</html> 

Here is the R code that generates the json data file from the Neo4J database; it’s really simple. Connect to the locally running Neo4J database, run a query to get the nodes, run a query to get the relationships (edges), and then do some quick clustering analysis in iGraph. If you cluster, you need to do a bit of work to get the communities into the data frame, and then a little more to get the json file. It’s not a lot of code for a nice visualisation of your Neo4J database.

library(RNeo4j)
library(igraph)

#connect to the locally running Neo4J database
neo4j = startGraph("http://localhost:7474/db/data/")

#get the nodes, with their internal id, name, and label
nodes_query = "MATCH (a) RETURN DISTINCT ID(a) AS id, a.name AS name, labels(a) AS type"
nodes = cypher(neo4j, nodes_query)

#get the relationships (edges)
rels_query = "MATCH (a1)--(a2) RETURN ID(a1) AS source, ID(a2) AS target"
rels = cypher(neo4j, rels_query)

#build an igraph graph and cluster it
ig = graph.data.frame(rels, directed = TRUE, nodes)
communities = edge.betweenness.community(ig)

#attach the community membership to the node data frame
memb = data.frame(name = communities$names, cluster = communities$membership)
nodes = merge(nodes, memb)

#assemble the json and write it out
nodes_json = paste0("\"nodes\":", jsonlite::toJSON(nodes))
edges_json = paste0("\"edges\":", jsonlite::toJSON(rels))
all_json = paste0("{", nodes_json, ",", edges_json, "}")

sink(file = 'data.json')
cat(all_json)
sink()
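For reference, the resulting data.json comes out in the shape Alchemy expects, something like this (illustrative values):

{"nodes":[{"id":0,"name":"Alice","type":["Person"],"cluster":1},
          {"id":1,"name":"Bob","type":["Person"],"cluster":2}],
 "edges":[{"source":0,"target":1}]}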

ZCash Mining With Nvidia Tesla

I have two Nvidia Tesla cards that I use for work; they are good for certain tasks around data mining and machine learning, and I have been interested in trying to use them for ZCash mining. I have an Antminer U3 that mines Bitcoins (slowly, and not that well sometimes), but I fancied a go at mining ZCash too. How hard could it be? Harder than it should have been.

Being CUDA cards, I would obviously need a CUDA-enabled miner to use them. I already mine with Antpool, and they allow users to mine ZCash, so that bit was easy. What wasn’t easy was getting a miner to compile. I tried nheqminer: the newest version wouldn’t compile, and it also doesn’t work with the Tesla cards I have, as they are not compute capability 5; the older versions wouldn’t compile either. I also had a few problems getting CUDA running on Kubuntu 16.04, as I needed to upgrade the NVidia drivers, which was a pain!

Solution

I got there in the end, as I found a binary of nanopool’s ewbf-miner that works with CUDA cards! This works great: I get about 70 Sol/s on the K20 and 95 Sol/s on the K40. That is pretty good; for comparison, you would get about 40 Sol/s on an i7-6700K CPU @ 4.00GHz, and I am sure that with a newer Tesla or CPU you would get more. This is only an experiment, however, not a mining operation, so I am happy.
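For reference, launching the EWBF miner looks something like this; the server and port below are placeholders, so substitute your pool’s stratum details and your own wallet and worker name:

./miner --server YOUR_POOL_SERVER --port 3333 --user YOURWALLET.worker1 --pass x

It prints the Sol/s per card once it connects to the pool.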


Kubuntu 16.04 LTS

One of my OpenSUSE workstations fell over: a weird problem with the login, where the sddm-greeter was crashing on start. I reinstalled a bunch of things, but it would not come back to life. I decided the system could do with a fresh start anyway; the partitioning was a mess, the bootloader was badly installed, and the system hadn’t been reinstalled for a long time.

I decided that I need this system to be stable, and OpenSUSE gets updated too frequently for some of the software that I run, leaving me solving problems with missing libraries or incompatible versions of software. A Linux distro with a longer life cycle was in order.

My initial thought was to install CentOS 7: it’s RPM based (what I am used to) and has the required stability. However, I had problems with CentOS and the NVidia drivers! Once installed, the system would not log in! Rather than spend ages figuring out how to fix it, I decided to switch to Ubuntu, or rather Kubuntu (I am a fan of KDE). As a user of Debian for my servers, I wasn’t expecting any problems. By the way, the partitioner in the CentOS installer is a mess; they need to sort that out.

Kubuntu was straightforward to install: no problems, and the partitioner is easy (take note, CentOS). I did hit a problem I have encountered before with Ubuntu, where the GUI crashes shortly after system start. This seems to be a problem with the open-source NVidia driver; if you drop to runlevel 3 you can install the proprietary driver and it’s fine, which is OK if you don’t mind using it. So far getting the system set up has been painless, and I am happy to have it going again without problems. Now to restore all the data and apps!
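For anyone hitting the same crash, the drop-to-text-mode dance goes something like this on 16.04; the driver package version is whatever ubuntu-drivers recommends on your hardware:

#switch to the text target (the systemd equivalent of runlevel 3)
sudo systemctl isolate multi-user.target

#see which driver is recommended, then install it
ubuntu-drivers devices
sudo apt-get install nvidia-375

#back to the GUI
sudo systemctl isolate graphical.target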


Syncthing

I seem to be constantly trying out different ways of syncing data. I have had no end of problems with Unison: apart from it being slow, getting build versions to match is a constant source of frustration.

However, I have found Syncthing, which seems great so far. I have ownCloud, which is great for syncing documents; think Dropbox that you own. However, I have some data sets that are very large and don’t change much, and ownCloud would just get clogged up looking after these. I could use rsync, which is fine, but Syncthing has some nice additional features.

Syncthing is a bidirectional sync tool that can also do versioning if you want it to. It works by running as a web service on your computers, accessible via a web GUI. It keeps directories up to date across a number of computers; there is no copy of the files in the cloud unless you put one there, so if you delete everything off one computer it will disappear from them all (eventually). It plays well with dynamic IP addresses and firewalls, all traffic between the computers is encrypted, and it is fast. It doesn’t use up too much resource; I set the sync frequency on my files to one hour, as they are unlikely to change that often and only I use them. These are mainly large data sets of source material, and photos that I edit on two computers.

Setup is pretty straightforward, the trickiest bit being auto-starting the service on different computers and operating systems. It works well.
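For what it’s worth, on a systemd-based Linux distro the packaged Syncthing ships a per-user unit, so auto-starting is just (assuming your package installed the unit files):

systemctl --user enable syncthing.service
systemctl --user start syncthing.service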
