It turns out that even though the Mint team make better decisions than the Ubuntu team, they’re not perfect. Hidden away in Mint’s network configuration are the OpenDNS servers set up as a fallback in case your usual DNS servers fail.
Unfortunately, this appears to interact oddly with Chrome, resulting in the fallback servers being used for all requests. This has been causing all sorts of problems for me recently, most notably some very aggressive blocking of sites for no apparent reason (Stack Overflow being one example).
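If you want to check whether the fallback is in play on your own machine, the nameservers actually being consulted are visible in resolv.conf (the OpenDNS addresses below are the well-known public ones; exactly where Mint injects the fallback may vary by release):

```shell
cat /etc/resolv.conf
# If the OpenDNS fallback is active, you will see entries like:
#   nameserver 208.67.222.222
#   nameserver 208.67.220.220
# Per-connection DNS servers can be overridden in Network Manager
# under the IPv4 Settings tab for the connection.
```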
This post contains massive spoilers if you haven’t already completed Halo 4 (and, to a lesser extent, the other games in the series). Continue reading at your own risk.
As a long-time fan of the Halo series across all media, be it games, books, or videos, the fourth video game installment in the Master Chief/Cortana story arc really piled on the feelings. (“Dem feels”, as I believe the kids of today refer to such things.) Interestingly, I haven’t played through the game myself—the timing of the release wasn’t good for me, and in the end impatience won out: I instead watched the playthrough published to YouTube by PauseUnpause. That first time I just sat back and absorbed the story. Recently, however, I watched a compilation of the cutscenes, along with some important moments during player control, and considered what factors caused the game to really hit the target emotionally. I believe it comes down to three essential factors, described below in no particular order.
I popped over to Rezzed today to check out a few games in person. Of particular interest to me was the Positech Games stand, manned by Cliff Harris.
I have a bit of a history with one of their games, Gratuitous Space Battles: while hanging around on the modding forum (and, of all places, sitting in the reception of a hotel in Egypt), I realised that a tool similar to the old SunEdit would make everyone’s life easier. A bit of coding in my spare time led to the creation of GSBEdit, and (apart from a two-year hiatus) I’ve been trying to make it a useful tool ever since.
While I was at the Positech stand I got a look at Democracy 3 in person. It’s a very detailed political simulator and I have to say, it’s the first time I’ve seen people show so much excitement while discussing policy ideas. The interface seems surprisingly intuitive – having not seen it previously, I could still immediately see how it functions and how all the different factors relate to each other (and trust me, there are a lot). It seems like it’s going to be a game that is easy to sink much more time into than you expect.
There is a lot of confusion among users about how to deal with LVM volume groups when trying to recover files from old drives (or, in our case, a newer drive that somehow got set up that way). If all you want to do is copy the files off, then the following post is by far the simplest method I’ve seen:
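For the impatient, the usual sequence looks something like this (the volume-group and logical-volume names in the mount command are hypothetical; sudo lvs will list the real ones on your system):

```shell
sudo apt-get install lvm2   # LVM userspace tools, if not already present
sudo vgscan                 # detect volume groups on the attached drive
sudo vgchange -ay           # activate every detected group
sudo lvs                    # list the logical volumes now available
sudo mkdir -p /mnt/recovery
sudo mount /dev/oldvg/root /mnt/recovery   # hypothetical VG/LV names
```

Once mounted, the files can be copied off like any other filesystem.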
When looking at sensor datasheets or similar documents there is a subtlety about relative error that may not be immediately apparent. Relative error is quoted as a percentage of the reading, but in some cases the range of interest to you might be small and have a large offset from zero.
Let’s take an extreme case. Imagine you have a temperature sensor that reads in Kelvin and you are measuring human skin temperature. It’s contrived, I know, but bear with me here. In this case, your measurement range of interest is going to be around 300 K to 315 K. Now imagine you have a sensor with a relative error of 0.5%. Sounds good, doesn’t it? Unfortunately that means that at a reading of 305 K (31.85 °C), your error is going to be 1.525 °C. That is slightly over 10% of your range—not so great anymore. Note that I was optimistic with the range size as well (you’re probably not going to be measuring skin temperatures around 40 °C), which makes the error even worse.
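A quick sanity check of those numbers at the command line (values taken straight from the example above):

```shell
# check the worked example: 0.5% relative error at a 305 K reading,
# against a 300-315 K range of interest
awk 'BEGIN {
  reading = 305          # K
  rel     = 0.005        # 0.5% relative error
  lo = 300; hi = 315     # range of interest, K

  abs_err = reading * rel
  printf "absolute error : %.3f K\n", abs_err              # 1.525 K
  printf "error vs range : %.1f%%\n", 100*abs_err/(hi-lo)  # 10.2%
}'
```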
How does this translate to more sensible measurements? In most cases you won’t have to worry. Op-amps historically had problems when their inputs carried a large DC offset, because the difference between the inputs becomes tiny compared to their absolute values. Weighing people is something I’m looking at currently, where a 1% relative error in the scales becomes more like a 2.25% error relative to the useful range. Not a crippling problem, but something to be aware of.
One problem I’ve encountered using Mint (though it’s not actually a Mint issue) is that xfig doesn’t get associated with .fig files. This is compounded by the fact that GNOME 3 (the basis of Cinnamon) always knows better than you, so there is no way to set your own custom file associations—if the application isn’t on the “approved” list it can’t be set.
After some research, I found some useful information and here I’ll describe how to apply it to the xfig situation specifically.
- I will assume that xfig is already installed. If not, install it now (sudo apt-get install xfig).
- Check that a .desktop file exists to inform the system that xfig is available. To do this, run sudo gedit /usr/share/applications/xfig.desktop and check that the file roughly matches the expected entry; if it is empty, fill it in.
- Set xfig as the default application for .fig files. Run sudo gedit /usr/share/applications/defaults.list and insert the association anywhere (though locating it with the other image/... entries may be sensible).
- While you’re at it, install gsfonts-x11, then log out and back in. This will avoid issues with Times, Helvetica, and so on in your images.
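For reference, a minimal xfig.desktop might look like the following (the exact Name/Comment fields are my guesses and vary between packages; the MimeType line is the part that matters):

```ini
[Desktop Entry]
Name=Xfig
Comment=Diagram editor
Exec=xfig %f
Icon=xfig
Terminal=false
Type=Application
Categories=Graphics;
MimeType=image/x-xfig;
```

The corresponding defaults.list association is then a single line: image/x-xfig=xfig.desktop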
When using subfloats (or subfigures, as they are sometimes called) in LyX, you may come across an error when exporting telling you that you can’t use \spacefactor in vertical mode. The message isn’t terribly helpful, but the solution is to add \makeatletter to the end of your LaTeX preamble.
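In LyX the preamble lives under Document ▸ Settings ▸ LaTeX Preamble (menu names from LyX 2.x); the workaround is a single line:

```latex
% works around the "\spacefactor in vertical mode" error with subfloats
\makeatletter
```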
Last time I covered easily reading data from a file into a useful structure. This time I will be demonstrating some summary statistics that can be generated from the data.
On occasion I need to convert a series of images into a movie. For example, I have a data visualiser which generates an image for each timestamp, and it is nice to be able to use these to view the evolution of the sensed values over time. If you search on this topic then you’ll find a lot of people suggesting ffmpeg with a command line such as the following:
ffmpeg -r 10 -b 1800 -i %03d.jpg test1800.mp4
However, this assumes that all the filenames are sequentially numbered and that there are none missing (it will finish when it reaches the first missing number). I needed something a little more flexible, and eventually found this page which suggests the following:
mencoder "mf://*.jpg" -mf fps=10 -o test.avi -ovc lavc -lavcopts vcodec=msmpeg4v2:vbitrate=800
This will simply read the files in the usual alphabetically sorted order. You can change "mf://*.jpg" to adjust which files it uses, along with tweaking the framerate and bitrate.
Contrary to the note on that page, more recent versions of mencoder support png files as input.
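As an aside, more recent ffmpeg builds can also cope with non-sequential filenames directly via glob patterns, which avoids mencoder entirely. A sketch, assuming a build compiled with glob support (libx264 and the yuv420p pixel format are my choices here, not requirements):

```shell
ffmpeg -framerate 10 -pattern_type glob -i '*.jpg' \
       -c:v libx264 -pix_fmt yuv420p out.mp4
```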
I had occasion to install Drupal 7 on my own Ubuntu server recently. Many of the guides I found went into some quite unnecessary steps, involved liberal use of tasksel, or made a variety of assumptions about your environment. In the end, though, I found this guide, which is short, simple, and just worked.
The only addition I required was to run a2enmod rewrite and to edit /etc/apache2/sites-available/drupal so that the entry for the /var/www/drupal/ directory contains AllowOverride All.
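Concretely, the extra steps were roughly the following (an Apache restart is needed for both changes to take effect; the path is as on my system):

```shell
sudo a2enmod rewrite
# in /etc/apache2/sites-available/drupal, inside the
# <Directory /var/www/drupal/> block, set:
#     AllowOverride All
sudo service apache2 restart
```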
Job done in almost no time.