Friday, 10 June 2011
(NOTE: this post is now redundant, Scala syntax highlighting in Sublime Text 2 is now dramatically better as of Build 2065, released on June 15th 2011)
At the recommendation of @roblally I decided to give Sublime Text 2 a spin, as I was starting to grow impatient with the poor tooling for Scala development. I've been using IntelliJ IDEA 10.5 CE with the Scala plugin and the SBT plugin for a week or so, but I've been getting really frustrated with how slow the whole setup has been and with its inaccurate error reporting. So, perhaps it's time to go all caveman and return to using just a text editor and SBT on the command line. I often use jEdit for simple editing of LaTeX and other such documents, but Rob believed Sublime would be worth a spin for coding-oriented development. So, here we are.
So, I've been using it for all of about 15 minutes. The first thing that immediately popped out at me was that the syntax highlighting for Scala was incorrect - often in my Lift code, where there are a number of objects declared within Mapper classes, the highlighting would pick out the object keyword on the first line but not on subsequent lines. A minor point, obviously, but I like this kind of basic stuff to work or I don't see the point in having it at all. It turns out that syntax highlighting in Sublime Text uses TextMate definition files, so with a quick Google I was able to find a GitHub repo by mariussoutier that provides a better .tmLanguage definition file for Scala than the one that currently ships with Sublime Text. The syntax file is here; you can just copy-paste its contents over the top of what exists in Sublime Text, and it seems to just work.
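If you'd rather do this from the command line, something along these lines should work on OS X. The package path here is an assumption based on a default Sublime Text 2 install (check where your copy keeps its packages), and it's worth keeping a backup of the bundled definition:
> cd ~/Library/Application\ Support/Sublime\ Text\ 2/Packages/Scala
> cp Scala.tmLanguage Scala.tmLanguage.orig
> curl -L -o Scala.tmLanguage <raw URL of the improved Scala.tmLanguage>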
Tuesday, 31 May 2011
Cross Compiling with Google Go
Since this is what I was doing 5 minutes ago, it seems like a reasonable first article for the reboot of the blog. A little side project I've been working on over the last couple of months has been an implementation of n-body physical simulation as part of the SICSA Multicore Challenge. The intention behind the challenge was to get as many different implementations of n-body simulation as possible, using a wide variety of languages, target architectures, and so on. I decided to give Google Go a spin, as I was interested in potentially using it as a research vehicle for my static analysis research. I will perhaps talk about my experiences of using Google Go and the n-body implementation in a later post - for the impatient, you may be able to glean some information from the presentation I gave at the workshop, or by looking at my horrible code.
When developing the implementation, I was using my trusty MacBook. However, I was intending to run the experiments on a Linux-based 8-core Xeon monster in the department, and was also interested in comparing the performance of 32-bit and 64-bit binaries. I soon discovered that the standard Go compiler can do cross-compilation, though it wasn't obvious from the documentation exactly how to do it. So, here's what to do. Firstly, build the compiler from source as you would normally:
> hg clone -u release https://go.googlecode.com/hg/ go
> cd go/src
> ./all.bash
This should go through without any errors. Though, if you're behind a proxy you may need to disable the network tests before running all.bash:
> export DISABLE_NET_TESTS=1
With a working build of Go, you should be able to happily develop away on your current machine. However, when you want to build a binary for a different operating system or architecture, you'll also need to build the tool chain (actually, the packages mostly) for the other platform. To do this, you'll need to set the environment variables GOOS and GOARCH to the appropriate values. See the Getting Started page for the list of options. With the environment variables set, you can run make.bash to build the appropriate packages for this alternate architecture.
> export GOOS=linux
> export GOARCH=386
> ./make.bash
I built versions for darwin, linux and freebsd in both 32-bit (386) and 64-bit (amd64) flavours.
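If you want all of those in one go, a loop along these lines (run from go/src) should do it - this is just a sketch rather than the exact commands I used:
for os in darwin linux freebsd; do
    for arch in 386 amd64; do
        GOOS=$os GOARCH=$arch ./make.bash
    done
done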
Now, when it comes to building your program, if you followed the advice in "How to write Go code" you will probably just be running gomake. Before you do this, simply set the GOOS and GOARCH values to the platform you wish to target. This will produce a statically linked binary for that os/arch combo, which you can then copy to your target machine to run. Simple!
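As a quick sanity check before copying it over, you can run file on the result - it should report an executable for the target platform (for linux/386, a 32-bit Intel 80386 ELF binary) rather than a Mach-O binary for your Mac. The binary name here is just illustrative:
> file binary_linux_386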
To simplify things, I wrote a little script that would compile the binary for all the platforms I cared about. You can see this script here, though essentially all it does is:
export GOOS=...
export GOARCH=...
gomake clean
gomake
mv -f binary binary_${GOOS}_${GOARCH}
This could be simplified for many os/arch combinations using a little function inside the script, but copy-paste was just as fast for me :)
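For what it's worth, here's a sketch of what such a function might look like - build_for is a made-up name and "binary" is a stand-in for the real output name, not taken from my actual script:
build_for() {
    export GOOS=$1
    export GOARCH=$2
    gomake clean
    gomake
    mv -f binary binary_${GOOS}_${GOARCH}
}
build_for darwin 386
build_for darwin amd64
build_for linux 386
build_for linux amd64
build_for freebsd 386
build_for freebsd amd64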
Monday, 9 May 2011
Starting from Scratch
Out with the old, in with the new...
I decided to "reboot" this blog today as part of a general drive to separate my professional and private lives. The blog which previously existed at this address was a mixture of both, but from now on it will be exclusively to do with my professional life, similar to my Twitter account. Personal stuff will go on Facebook, so if you know me, add me there.
A short summary of who I am and what I'm doing is in order:
(PREVIOUSLY ON...) The Professional Life of Iain McGinniss
- Worked for Sword Ciboodle as Technical Architect until 2008.
- Currently working part-time for Onedrum as a network specialist, contributing to the JXTA and Netty open source libraries on their behalf.
- Currently undertaking a PhD in Computing Science at the University of Glasgow.
I work in the area of programming language semantics and type theory, though I try to stay connected with practical software engineering as much as I can. So posts here in future are likely to be a mix of theoretical and practical topics.