I had a lot of problems with openSUSE Leap 42.2 and Lightworks. This time it was portaudio causing all the trouble: it would either hang Lightworks at startup, at the point where portaudio tried to detect the audio interfaces, or it would just produce a mess of buffer overloads.
I fixed it in a slightly odd way. I downloaded Ubuntu's builds of libportaudio2 and libportaudiocpp for the right architecture, then unpacked them into /usr/lib/lightworks, which is where Lightworks looks first for libraries. This worked, with one additional requirement: you have to use GlobalSettings.txt in your ~/Lightworks directory to make sure that portaudio uses PulseAudio as the interface, which avoids the buffer problems. Do all that and Lightworks 12.6 is happy on openSUSE Leap 42.2.
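For the record, the unpacking step looks roughly like this. It's a sketch rather than a definitive recipe: the .deb filenames are examples, and you need to download the right ones for your architecture from packages.ubuntu.com first.

```shell
# Unpack Ubuntu's portaudio .debs without installing them, then drop the
# shared objects into the directory Lightworks searches first.
mkdir -p pa-unpacked
for deb in libportaudio2_*.deb libportaudiocpp0_*.deb; do
    if [ -f "$deb" ]; then
        # dpkg-deb -x extracts the package's file tree into a directory
        dpkg-deb -x "$deb" pa-unpacked
    else
        echo "skipping $deb (not downloaded yet)"
    fi
done
# Copy as root; the lib path inside the deb varies by release/architecture:
# cp pa-unpacked/usr/lib/x86_64-linux-gnu/libportaudio*.so* /usr/lib/lightworks/
```

The nice thing about this approach is that nothing is installed system-wide, so your distro's own portaudio packages are untouched.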
I am about to try Lightworks 14, to see how that works before I pay for the upgrade.
I gave up on SpiderOak in the end; it was too resource-heavy for my laptops, and their batteries were really struggling. The constant encrypting and decrypting was too much. It's a shame, because as a backup system for large amounts of data it was really very good, especially if that data lived on desktops or didn't change much. I couldn't justify keeping it for that use alone; too expensive at $25 a month (which is about £20 now!). So I looked back into OwnCloud.
I tried OwnCloud a number of years ago, as I run my own home server anyway. Back then the desktop sync client was just not up to the job: it would take forever to sort out changes and sync them, often using up entire CPU cores on both the server and the client machines. I am pleased to say that problem has been solved, and OwnCloud is now a great choice if you are running your own server somewhere anyway. The sync seems fast, and the setup is easy. I am using a self-signed SSL certificate, which doesn't cause much in the way of annoying warnings. OwnCloud has the other obvious advantage that you can really limit access to the data, and it supports encrypting the data on the server too. Or you can encrypt the drives, etc. I have been using it for about 4 months without problems. The only issue I had was when there was a power cut in my house while I was away. Doh! That killed the server, obviously.
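Generating a self-signed certificate for a box like this is a one-liner with openssl. The hostname and lifetime below are just examples; adjust them for your own server.

```shell
# Create a self-signed certificate + key pair for the OwnCloud virtual host.
# -nodes leaves the key unencrypted so the web server can read it at boot.
openssl req -x509 -newkey rsa:4096 -nodes \
    -keyout owncloud.key -out owncloud.crt \
    -days 825 -subj "/CN=home-server.local"
ls -l owncloud.key owncloud.crt
```

You then point your web server's SSL config at the two files and accept the browser warning once per client.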
The downside, of course, is that with SpiderOak I had a 'cloud' copy of all my important data at a different location to my house. That is no longer the case. I don't use OwnCloud to sync the large data-sets across my computers; I sync those with Unison instead. So, to avoid a catastrophic loss in the event of a disaster, I have synced all my important data and docs to my office computer using Unison. That is about 1.7TB of data onto a 5TB RAID array. The first sync took ages, but this data doesn't change very much, so later runs are quick. I also use Amazon Glacier for data that can essentially be archived off and forgotten about; it's super cheap.
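For what it's worth, the Unison side of this is driven by a profile file. Something like the sketch below; the roots and hostname are invented, so substitute your own.

```
# ~/.unison/office-mirror.prf  (hypothetical profile)
root = /home/me/important-data
root = ssh://office-machine//mnt/raid/backup
batch = true      # run without prompting, good for cron
times = true      # preserve modification times
prefer = newer    # on conflict, keep the most recently changed copy
```

You then run it with `unison office-mirror`, and after that painful first pass only the deltas go over the wire.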
I had to do it eventually. My security-expert credentials (such as they are) were coming under question because of the lack of HTTPS on my web server. This was partly because, until recently, it would have required additional expense: extra IPs and SSL certificates. Not so anymore! Let's Encrypt is a great service where you can get free SSL certificates for your website, although I highly recommend giving a small donation if you are able. At this point I would normally write a blog post detailing how I set up the service for my server and websites. However, as I run my server on Bytemark's Symbiosis, I didn't have to do anything other than turn it on; all the config was done automatically. I did add a file named 'ssl-only' to the 'config' dir of my hosts to ensure that all traffic is redirected to HTTPS. I would have done it sooner, but I needed to upgrade the OS on my server first, and in order to do that I needed to check backups and all that stuff you should do anyway.
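The 'turn it on' step really is just dropping an empty file in place. A sketch below, using a local demo directory so the commands are safe to try anywhere; on a real Symbiosis box the root is /srv.

```shell
SRV=srv-demo            # on a real Symbiosis server: SRV=/srv
DOMAIN=example.com      # substitute your own host
mkdir -p "$SRV/$DOMAIN/config"   # already exists on a real box
# An empty file named ssl-only tells Symbiosis to redirect all HTTP
# traffic for this host to HTTPS.
touch "$SRV/$DOMAIN/config/ssl-only"
ls "$SRV/$DOMAIN/config"
```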
This is the first time I have travelled for an extended period with only my Chromebook. I would normally take a laptop if I was going to be away for more than a couple of days. Not only that, I have to write and then present a talk at a conference, so I need Google Slides to do the business. I was reasonably confident it would; there is, however, one snag.
I have had a bit of a faff copying some sections out of a PDF file to go into my presentation. There is no way to easily copy and paste from any of the PDF tools I have used, and Slides will not import a PDF file as an image. This is either because PDF support in Google Docs just isn't very mature or a priority for Google, or, and I have my suspicion this is part of it, Google doesn't want you copying out of PDF files. I am not entirely sure why that might be; perhaps they are trying to protect copyright? However, it can be a pain if you are trying to use a figure as part of a talk.
I have found what is a fairly good solution; in fact it might be enough of an obvious (once you figure it out) solution that there isn't a problem here after all. I learnt how to take screenshots on the Chromebook. Either by:
ctrl+’task select’ and you get the whole desktop, or
ctrl+shift+’task select’ and you can draw a box round a section
Easy: you can either copy the image to the clipboard or save it as a file. Probably easy enough that my conspiratorial ideas of Google somehow blocking it don't really make sense; it just took me a while to figure out. 🙂 You can even zoom into the bit of the PDF you want before grabbing the screenshot.
I have been cleaning up and resurrecting a beast of a computer. It's a dual Xeon at 3.16GHz with 8GB of RAM. It's not that high spec compared with today's hardware: a fairly average GPU and a RAID card with currently 7x 500GB HDs, plus a 1TB system disk. I think I will get it going and maybe find a use for it, perhaps as a web scraper or something. That, or I could sell it. Gonna have to get an HD off eBay to make up the 8, that and find the SATA PSU cables!
Two of my openSUSE boxes were performing really badly, but only some of the time. I tracked this down to the Baloo file-indexing program. This is the tool that indexes your files for search, which is useful, but only if it doesn't cause horrendous performance problems.
The symptoms were very frustrating desktop performance: the mouse would stutter, and every few seconds or so the system would sort of stop and then start again. It seemed to affect all apps, to the point where they were basically unusable. It felt like a constant interrupting of the system, which it might well have been.
The solution! Disable Baloo using the following command:
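Assuming a reasonably recent KDE Plasma, the indexer is controlled with the balooctl tool, so the command should be along these lines:

```shell
# balooctl is KDE's control tool for the Baloo indexer; "disable" stops
# it now and keeps it off across logins. The guard just makes this
# snippet safe to run on non-KDE systems.
if command -v balooctl >/dev/null 2>&1; then
    balooctl disable
else
    echo "balooctl not found (not a KDE system?)"
fi
```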
Then reboot the system and all will be well. I rarely used the file indexing anyway; it would often crash on its own.
I have been going through a process of tidying up! I have got a server rack and put a load (3) of my computers in it. I got it free (thanks, York University) and all it needed was a new wheel. It's working out pretty well. The top one is my home server, the middle a Windows machine, and the bottom my quad Opteron system that doesn't get used much these days. Bit of a relic from the PhD days of ploughing through lots of simulations.
I switched from Dropbox to SpiderOak a while ago, for a couple of reasons. One, I wanted to upload more things to the cloud for backup, video projects etc., and SpiderOak has a personal 5TB plan. I was also starting to get nervous about security, and I think SpiderOak's encryption system is better than Dropbox's. However, this transition has its downsides.
The SpiderOak client is not as slick as Dropbox's; in fact it looks rather retro. I don't care about the retro, but the interface takes a bit of getting used to. That said, once you get your head round it, it is not difficult to use. The main downside is that if you change a lot of files then your computer is going to spend a lot of time encrypting and decrypting them. Uploading a load of photographs from my digital camera took a fair while on my MacBook Pro, and that is a fair while of one core at 100% and the fan spinning away like crazy.
That is not the only way to flatten a laptop battery quickly. As a Dropbox user I got into the habit of working straight out of Dropbox: I would incrementally save files into the Dropbox folder, and Dropbox would dutifully upload each save to the cloud. The situation with SpiderOak is somewhat different. The save event has to be detected, then the file is encrypted into some sort of bundle of files, and that bundle is uploaded to the cloud. I have just finished working on a Word document, and when I looked at the upload queue in the SpiderOak client, three versions of the file were there waiting to go. I have also been working on my laptop for maybe 3 hrs from full charge. The relatively large amount of work SpiderOak does for all that encryption means I have a warm laptop with 28% battery… not so good.
I think this system also causes problems for SpiderOak. The client reports that I am using 1.6TB of my 5TB of space. However, when it reports the size of my files on their servers, it's over 5TB. This is because of all those duplicate encrypted files, and I am not sure how they go about solving that problem. The big duplications happened when I changed round some computer systems and the client believed the files on the hard disks were all new versions of the entire content I store in the cloud. Oh dear.
So what to do? I like the extra storage space, and I like the encryption. I don't like the battery usage or the CPU hammering. It's fine on my huge desktops that have 8+ cores and tons of RAM; I don't even notice. At the very least, live files like that Word doc might have to be edited outside of SpiderOak and then copied in. That, or I go looking for another solution.
I don't spend much time with Windows; my desktops are largely Linux and my laptops Macs. I do have one rather ageing Windows machine that recently got borked by an update. It was while reinstalling the drivers and software that Windows Update couldn't find that I came across DriverTuner. What a hateful piece of software this is. I have been using computers for years, and I started early enough to remember the days when you had to manually set IRQs on hardware with little jumpers. So tracking through a website using the model numbers of my hardware isn't much of a hardship. I don't believe it would be to anyone, really; we can all navigate websites…
Therefore I found it extremely irritating that, when trying to install both my Canon printer and Epson scanner, I kept navigating to a specific page only to download DriverTuner. On the Epson site it even came down in a zip file that looked like the correct version of Epson Scan. DriverTuner is irritating because it doesn't seem to work; it failed to find anything to install on my computer. Not only that, it also seemed to want me to buy it. Would it have made me buy it to get my drivers? To me DriverTuner is nothing more than an annoying, overly complex, failed solution to an almost non-existent problem. Another piece of crapware clogging up my system.
Canon at least had a version of the drivers and software somewhere on their site that I was eventually able to locate and download; it installed and worked. Epson, not so. I had to resort to chatting to a helper to get a link to a page I could download the driver from. Why do manufacturers insist on making everyone download this software? Why not let people try the automatic method if they want, and let those of us who just want to go to the site and get the driver without any fuss do that?
So a recent update caused no end of problems with my Windows 10 machine. It would repeatedly fail to install the update, and then, when it finally managed it, I couldn't log into the machine: user profile not accessible. I decided it was probably time to reinstall the machine anyway. However, it seems the problem came back, a little different this time.
I reinstalled but chose to leave my Nvidia soft/fake RAID partition unchanged, as I used it as my user space. The machine reinstalled and everything appeared to be fine. I could get to the RAID array, so I left it as is and went about installing drivers and updates. Updates! That is the problem. It seems I have become one of the many victims of a Windows 10 update that stops Nvidia RAID arrays from working (see here in German). No sooner had I updated the machine and restarted than the RAID array vanished again. So annoying. This machine is a little old, but it is by no means obsolete: it's a dual quad-core Opteron with 16GB of RAM and a half-decent GPU. Plenty of life left in it yet, especially as all I use it for is scanning and printing photographs. The RAID array still reports as healthy at boot. I suppose I should have known not to trust a fake/soft RAID long term; however, I wasn't expecting an update to do it in. So what to do… buy a cheap RAID card off eBay and use that, or turn off Nvidia RAID and use the Windows fake RAID instead? Whatever; they should have warned us of this potential problem if they knew.