If you are reading this blog post, you are actually reading a historic version of my homepage that I keep around solely for the purpose of looking at it again when I am old, just like the previous two instances. So unless you are me from the future, you likely want to have a look at my current homepage.
At my university, we recently held an exam that covered a bit of Haskell, and a simple warm-up question at the beginning asked the students to implement last :: [a] -> a. We did not demand a specific behaviour for last [].
This is a survey of various solutions, only covering those that are actually correct. I elided some variation in syntax (e.g. guards vs. if-then-else).
Most wrote the naive and straightforward code:
last [x] = x
last (x:xs) = last xs
Then quite a few seemed to be uncomfortable with pattern-matching and used conditional expressions. There was some variety in finding out whether a list is empty:
last (x:xs)
| null xs == True = x
| otherwise = last xs
last (x:xs)
| length (x:xs) == 1 = x
| otherwise = last xs
last (x:xs)
| length xs == 0 = x
| otherwise = last xs
last xs
| length xs > 1 = last (tail xs)
| otherwise = head xs
last xs
| length xs == 1 = head xs
| otherwise = last (tail xs)
last (x:xs)
| xs == [] = x
| otherwise = last xs
The last one is not really correct, as it has the stricter type Eq a => [a] -> a. Also, we did not expect our students to avoid the quadratic runtime caused by using length in every step (length traverses the whole remaining list on each recursive call, so the total work adds up to O(n²)).
The next class of answers used length to pick out the right element, either using (!!) directly, or simulating it with head and drop:
last xs = xs !! (length xs - 1)
last xs = head (drop (length xs - 1) xs)
There were two submissions that spelled out an explicit left folding recursion:
last (x:xs) = lastHelper x xs
  where
    lastHelper z []     = z
    lastHelper z (y:ys) = lastHelper y ys
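This helper is a left fold written out by hand. Nobody submitted the point-free version, but for comparison, the same idea expressed with foldl1 would be (my sketch, not a student answer):

import Prelude hiding (last)

last :: [a] -> a
last = foldl1 (\_ y -> y)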
And finally there are a few code-golfers that just plugged together some other functions:
last x = head (reverse x)
Quite a lot of ways to write last!
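As an aside, if one wanted to check mechanically that such a definition agrees with the Prelude’s last on all non-empty lists, a small QuickCheck property would do; a sketch (the name lastNaive is mine, standing in for any of the variants above):

import Test.QuickCheck

-- the naive variant from above, renamed to avoid clashing with Prelude.last
lastNaive :: [a] -> a
lastNaive [x]    = x
lastNaive (_:xs) = lastNaive xs

-- compare against the Prelude's last on arbitrary non-empty lists
prop_last :: NonEmptyList Int -> Bool
prop_last (NonEmpty xs) = lastNaive xs == last xs

main :: IO ()
main = quickCheck prop_last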
I have an office with a nice large window, but more often than not I have to close the shades to be able to see something on my screen. Even worse: There were so many nice and sunny days where I would have loved to take my laptop outside and work there, but it (a Thinkpad T430s) is simply not usable in bright sun. I have seen those nice eInk-based eBook readers, which get clearer the brighter the light is. That’s what I want for my laptop, and I am willing to sacrifice color, and a bit of usability due to latency, for being able to work in the bright daylight!
So while I was in Portland for DebConf14 (where I guess I felt a bit more like tinkering than otherwise) I bought a Kobo Aura HD. I chose this device because it has a resolution similar to my laptop (1440×1080) and I have seen reports from people running their own software on it, including completely separate systems such as Debian or Android.
This week, I was able to play around with it. It was indeed simple to tinker with: You can simply copy a tarball to it which is then extracted over the root file system. There are plenty of instructions online, but I found it easier to take them as inspiration and do it my way – with basic Linux knowledge that’s possible. This way, I extended the system boot script with a hook to a file on the internal SD card, and this file then runs the telnetd daemon that comes with the device’s busybox installation. Then I just have to make the device go online and telnet onto it. From there it is a pretty normal Linux system, albeit without an X server, using the framebuffer directly.
I even found an existing project providing a VNC client implementation for this and other devices, and pretty soon I could see my laptop screen on the Kobo. Black and white worked fine, but colors and greyscales, including all anti-aliased fonts, were quite broken. After some analysis I concluded that it was confusing the bit pattern of the pixels. Luckily kvncclient shares that code with koreader, which worked fine on my device, so I could copy some files and settings from there, et voilà: I now have an eInk monitor for my laptop. As a matter of fact, I am writing this text with my Kobo sitting on top of the folded-back laptop screen!
I did some minor adjustments to my laptop:
– I matched the laptop’s screen layout to the Kobo’s resolution; using xrandr’s --panning option this is possible even though my real screen is only 900 pixels high.
– I turned off syntax highlighting (:set syntax=off in vim).
All this is still very manual (going online with the Kobo, finding its IP address, logging in via telnet, killing the Kobo’s normal main program, starting x11vnc, finding my IP address, starting the VNC client, doing the adjustments mentioned above), so I need to automate it a bit. Unfortunately, there is no canonical way to extend the Kobo with your own application: The Kobo developers made their device quite open, but stopped short of actually encouraging extensions, so people have created many weird ways to start programs on the Kobo – dedicated start menus, background programs observing when the regular Kobo app opens a specific file, complete replacements for the system. I am considering simply running an SSH server on the device and driving the whole process from the laptop. I’ll keep you up-to-date.
A dream for the future would be to turn the Kobo into a USB monitor that I can simply connect to any computer, where it then shows up as a new external monitor. I wonder if there is a standard for USB monitors, and if it is simple enough (but I doubt it).
A word about the Kobo development scene: It seems to be quite active and healthy, and a number of interesting applications are provided for it. But unfortunately it all happens on a web forum, and they use it not only for discussion, but also as a wiki, a release page, a bug tracker, a feature request list and as a support line – often in one single thread with dozens of posts. This makes it quite hard to find relevant information and to decide whether it is still up-to-date. Unfortunately, you cannot really do without it. The PDF viewer that comes with the Kobo is barely ok-ish (e.g. no crop functionality), so installing, say, koreader is a must if you read more PDFs than actual ebooks. And then you have to deal with the how-to-start-it problem.
That reminds me: I need to find a decent RSS reader for the Kobo, or possibly a good RSS-to-epub converter that I can run automatically. Any suggestions?
PS and related to this project: Thanks to Kathey!
The last half year has been very quiet around our Tip-Toi project. The first goal had been reached back then: I was able to equip existing books with new texts, and make my nephew happy. And so nothing happened for quite a while.
Until, a few days ago, a certain “Pronwan” suddenly got in touch and reported that he had successfully designed complete Tip-Toi books (well, pages) for his children, printed them and equipped them with his own audio files. He described the whole process in a video tutorial. The upshot: printing the books yourself works if you make the images a bit paler (black dots on a black background work badly) and, if necessary, first print the image with a color laser printer and then overlay the dot matrix with a black-and-white printer.
My tttool also makes an appearance in the video and – although really written for myself and for people who want to better understand the GME file format – comes across quite well. It is also worth mentioning that “Pronwan” uses Windows, presumably with the exe file that I pre-compiled under Wine – Haskell’s platform independence pays off.
The whole thing seems to be quite feasible for an interested hobbyist. I am curious what else is to come. Maybe there will soon be a website where home-made Tip-Toi books can be collected and exchanged! The latest news can always be found in the Microcontroller forum.
Another on-the-journey-back blog post; this time from the Frankfurt Airport Train Station – my flight was delayed (had I known that, I could have watched the remaining Lightning Talks), and so was my train, but despite 5 minutes of running through the airport, it was just not enough. And now that the free 30 minutes of railway station internet are used up, I have nothing else to do but blog...
Last week I was attending ICFP 2014 in Gothenburg, followed by the Haskell Symposium and the Haskell Implementors Workshop. The justification to attend was the paper on Safe Coercions (joint work with Richard Eisenberg, Simon Peyton Jones and Stephanie Weirich), although Richard got to give the talk, and did so quite well. So I got to leisurely attend the talks, while fighting the jet-lag that I brought from Portland.
There were – as expected – quite a few interesting talks. Among them the first keynote, Kathleen Fisher on the need for formal methods in cars and toy-quadcopters and unmanned battle helicopters, which made me conclude that my Isabelle skills might eventually become relevant in practical applications. And did you know that if someone gains access to your car’s electronics, they can make the seat belt pull you back hard?
Stephanie Weirich’s keynote (and the subsequent related talks by Jan Stolarek and Richard Eisenberg) on what a dependently typed Haskell would look like and what we could use it for was mouth-watering. I am a bit worried that Haskell will become a bit obscure for newcomers and people that simply don’t want to think about types too much; on the other hand, it seems that Haskell as we know it will always stay there, just as a subset of the language.
Similarly interesting were refinement types for Haskell (talks by Niki Vazou and by Eric Seidel), in the form of Liquid Types, something that I have not paid attention to yet. It seems to be a good way to get higher assurance in Haskell code.
Finally, the Haskell Implementors Workshop had a truckload of exciting developments in and around Haskell: More on GHCJS, Partial type signatures, interactive type-driven development like we know it from Agda, the new Haskell module system and amazing user-defined error messages – the latter unfortunately only in Helium, at least for now.
But it’s not the case that I only sat and listened. During the Haskell Implementors Workshop I gave a talk, “Contributing to GHC”, with a live demo of me fixing a (tiny) bug in GHC, with the aim of getting more people to hack on GHC (slides, video). The main message here is that it is not that big of a deal. And despite me not actually saying much interesting in the talk, I got good feedback afterwards. So if it now actually motivates someone to contribute to GHC, I’m even happier.
And then there is of course the Hallway Track. I discussed the issues with fusing a left fold (unfortunately, without a great solution). In order to tackle this problem more systematically, John Wiegley and I created the beginning of a “List Fusion Lab”, i.e. a bunch of list benchmarks and the possibility to compare various implementations (e.g. with different RULES) and various compilers. With that we can hopefully better assess the effect of a change to the list functions.
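For context, and in my own words rather than anything we concluded there: the standard way to let a left fold take part in foldr/build fusion is to re-express it as a right fold, roughly like this:

-- foldl written as a foldr: the foldr now builds a function that is
-- finally applied to the initial accumulator z. This form can fuse,
-- but it is only fast if the compiler can eta-expand the closures,
-- which is the kind of problem Call Arity is concerned with.
foldlViaFoldr :: (b -> a -> b) -> b -> [a] -> b
foldlViaFoldr f z xs = foldr (\x k acc -> k (f acc x)) id xs z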
PS: The next train is now also delayed, so I’ll likely miss my tram and arrive home even later...
PPS: I really have to update my 10 year old picture on my homepage (or redesign it completely). Quite a few people knew my name, but expected someone with shoulder-long hair...
PPPS: Haskell is really becoming mainstream: I just talked to a randomly chosen person (the boy sitting next to me in the train), and he is a Haskell enthusiast, building a structured editor for Haskell together with his brother. And all that as a 12th-grader...
I’m writing this blog post on the plane from Portland towards Europe (which I now can!), using the remaining battery life after having watched one of the DebConf talks that I missed. (It was the systemd talk, which was good and interesting, but maybe I should have watched one of the power management talks, as my battery is running down faster than it should, I believe.)
I mostly enjoyed this year’s DebConf. I must admit that I did not come very prepared: I had neither something urgent to hack on, nor important things to discuss with the other attendees, so in a way I had a slow start. I also felt a bit out of touch with the project, both personally and technically: In previous DebConfs, I had more interest in many different corners of the project, and also came with more naive enthusiasm. After more than 10 years in the project, I see a few things more realistically and also more relaxed, and don’t react to “Wouldn’t it be cool to have crazy idea” very easily any more. And these days I mostly focus on Haskell packaging (and related tooling, which sometimes is also relevant and useful to others), which is not very interesting to most others.
But in the end I did get to do some useful hacking, heard a few interesting talks and even got a bit excited: I created a new tool to schedule binNMUs for Haskell packages which is quite generic (configured by just a regular expression), so that it can and will be used by the OCaml team as well, and who knows who else will start using hash-based virtual ABI packages in the future... It runs via a cron job on people.debian.org to produce output for Haskell and for OCaml, based on data pulled via HTTP. If you are a Debian developer and want up-to-date results, log into wuiet.debian.org and run ~nomeata/binNMUs --sql; it then uses the projectb and wanna-build databases directly. Thanks to the ftp team for opening up incoming.debian.org, by the way!
Unsurprisingly, I also gave a talk on Haskell and Debian (slides available). I talked a bit too long and we had too little time for discussion, but in any case not all discussion would have fitted into 45 minutes. The question of which packages from Hackage should be added to Debian and which not is still undecided (which means we carry on packaging what we happen to want in Debian for whatever reason). I guess the better our tooling gets (see the next section), the more easily we can support more and more packages.
I am quite excited by and supportive of Enrico’s agenda to remove boilerplate data from the debian/ directories and to rely on autodebianization tools. We have such a tool for Haskell packages, cabal-debian, but it is unofficial, i.e. neither created by us nor fully endorsed. I want to change that, so I got in touch with the upstream maintainer and we want to get it into shape for producing perfect Debian packages, if the upstream-provided meta data is perfect. I’d like to see the Debian Haskell Group follow Enrico’s plan to its extreme conclusion, and this way drive innovation in Debian in general. We’ll see how that goes.
Besides all the technical program I enjoyed the obligatory games of Mao and Werewolves. I also got to dance! On Saturday night, I found a small but welcoming Swing-In-The-Park event where I could dance a few steps of Lindy Hop. And on Tuesday night, Vagrant Cascadian took us (well, three of us) to a blues dancing night, which I greatly enjoyed: The style was so improvisation-friendly that despite having missed the introduction and never having danced Blues before I could jump right in. And in contrast to social dances in Germany, where it is often announced that the girls are also invited to ask the boys, but then it is still mostly the boys who have to ask, here it took only half a minute of standing at the side until I got asked to dance. In retrospect I should have skipped the HP reception and gone there directly...
I’m not heading home at the moment, but will travel directly to Göteborg to attend ICFP 2014. I hope the (usually worse) west-to-east jet lag will not prevent me from enjoying that as much as I could.
After a bit more than 9 years, I am replacing Serendipity, which has been hosting my blog, with a self-made static solution. This means that when you are reading this, my server no longer has to execute some rather large body of untyped code to produce the bytes sent to you. Instead, that happens once in a while on my laptop, and they are stored as static files on the server.
I hope to get a little performance boost from this, so that my site can more easily hold up to being mentioned on Hacker News. I also do not want to worry about security issues in Serendipity – static files are not hacked.
Of course there are down-sides to having a static blog. The editing is a bit more annoying: I need to use my laptop (previously I could post from anywhere) and I edit text files instead of using a JavaScript-based WYSIWYG editor (but I was slightly annoyed by that as well). But most importantly your readers cannot comment on static pages. There are cloud-based solutions that integrate commenting via JavaScript on your static pages, but I decided to go for something even more low-level: You can comment by writing an e-mail to me, and I’ll put your comment on the page. This has the nice benefit of solving the blog comment spam problem.
The actual implementation of the blog is rather masochistic, as my web page runs on one of these weird obfuscated languages (XSLT). Previously, it consisted of XSLT stylesheets producing makefiles calling XSLT stylesheets. Now it is a bit more self-contained, with one XSLT stylesheet writing out all the various HTML and RSS files.
I managed to import all my old posts and comments thanks to this script by Michael Hamann (I had played around with this some months ago and just spent what felt like an hour finding this script again) and a small Haskell script. Old URLs are rewritten (using mod_rewrite) to the new paths, but feed readers might still be confused by this.
This opens the door to a long due re-design of my webpage. But not today...
I have been a user of GNOME for a long time. I must have started using it in either 2000 or 2001, when LinuxTag was in Stuttgart. For some reason I wanted to start using one of the two Desktop Environments available (having used fvwm95 and/or IceWM before, I believe). I approached one of the guys at the GNOME booth and asked the question “Why should I use GNOME over KDE?”, knowing that it is quite a silly question, but unable to come up with a better one. He replied something along the lines of “Because it is part of GNU”, and that was good enough for me. Not that it matters a lot whether I use one or the other, but it was a way to decide it.
Back then GNOME was still version 1.2, with detachable menus and lots of very colorful themes – I first had something with thick yellow borders and then a brushed metal look. Back then, sawfish was the window manager of choice.
I used GNOME for many years. People complained when GNOME 2.0 came out, but I liked the approach they were taking: Simplicity and good defaults are a time saver! I did my bit of customization, such as having my panel vertically on the left edge, and even had a tool running that would react to certain events and make the window manager do stuff, such as removing the title bar and the borders from my terminals – naked terminals are very geeky (I forgot the name of the tool, but surely some will recognize and remember it).
In 2009 I got more and more involved in Haskell and stumbled over xmonad, a tiling window manager implemented and configured in Haskell. I found this a user interface that I liked a lot, so I started using it. This was no problem: GNOME happily let me replace the default window manager (metacity) with xmonad, and continue working. I even implemented the necessary support in xmonad so that it would spare out the gnome-panel, and that the pager (which displays the workspaces and windows) would work, and even interact with xmonad.
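For the curious: the xmonad side of such a setup ends up tiny; a minimal xmonad.hs along these lines (a sketch using the XMonad.Config.Gnome module from today’s xmonad-contrib, not necessarily what I ran back then):

import XMonad
import XMonad.Config.Gnome (gnomeConfig)

-- gnomeConfig extends the default configuration with EWMH hints (so the
-- gnome-panel pager sees workspaces and windows) and strut handling (so
-- windows do not cover the panel)
main :: IO ()
main = xmonad gnomeConfig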
I was happy with this setup for a few more years, until GNOME3 came out. Since then, it has become harder and harder to maintain the setup. The main reason is gnome-shell, which replaces gnome-panel and doesn’t work with any window manager but the new default, mutter. I want to use GNOME’s panel, but not its window manager, so I was stuck with a hardly maintained gnome-panel. I fixed what I could (with some patches applied upstream two years after submission, and some not at all) and lived with the remaining warts.
But a few days ago, GNOME 3.12 was pushed to Debian and I couldn’t even log out or shut down the computer any more, as gnome-session now tries to talk to gnome-shell to do that. Also, monitor configuration (e.g. remembering what setup to use when which monitors are attached) has been moved to gnome-shell. I tried to work around it a bit, but I quickly realized that it was time to make a decision: Either do it the GNOME way all the way, including using gnome-shell, or ditch GNOME.
Now as I said: I like the design and the philosophy of GNOME, including GNOME3, and I have told people who were complaining about it to first give it a try, and succeeded. I also tried it, but after years of using a tiling window manager, I just couldn’t adjust to not having that any more. If xmonad could be changed to remotely control gnome-shell, this might actually work for me! I think one of the biggest problems I had was adjusting to how gnome-shell handles multiple monitors. In xmonad, my workspaces are independent of the monitors, and I can move any workspace to any monitor.
So I had to ditch GNOME. My session now consists of a shell script that makes some adjustments (blank black background, loading the xmodmap), starts a few tools (taffybar, mail-notification, nagstamon, xscreensaver and dunst) and executes xmonad. So far it works well. It boots faster, it suspends faster.
I still use some GNOME components. I log in using gdm (but it is auto-login, I guess I could try something faster), and gnome-keyring-daemon is also started. And I still use evolution (which has its own set of very disappointing problems in the current version).
Compared to my old setup, I’m still missing my beloved link-monitor-applet, but I guess I can implement an approximation to that in taffybar. The same goes for some other statistics like CPU temperature. I don’t have the GNOME menu any more, which I did not use regularly, but which was useful occasionally.
The biggest problem so far is the lack of session management: I have yet to find a good way to log out and shut down, while still giving Firefox time to finish without believing it crashed. Dear lazyweb: What is the best solution for that problem? Can systemd help me here somehow?
All in all I want to thank the GNOME guys for providing me with a great desktop environment for over a decade, and I hope I’ll be able to use it again one day (and hopefully not out of necessity and lack of viable alternatives).
When I gave my “Haskell Bytes” talk on the runtime representation of Haskell values the first time, I wrote here “It is in German, so [..] if you want me to translate it, then (convince your professor or employer to) invite me to hold the talk again”. This has just happened: I got to hold the talk as a Tech Talk at Galois in Portland, so now you can fetch the text also in English. Thanks to Jason for inviting me!
This was on my way to the Oregon Summer School on Programming Languages in Eugene, where I’m right now enjoying the shade of a tree next to the campus. We’ve got a relatively packed program with lectures on dependent types, categorical logic and other stuff, and more student talks in the evening (which unfortunately always collide with the open board game evenings at the local board game store). So at least we started to have a round of Diplomacy, where I am about to be crushed from four sides at once. (And no, I don’t think that this has triggered the “illegal download warning” that the University of Oregon received about our internet use and threatens our internet connectivity.)
I’m writing this on the train back from the ZuriHac Haskell Hackathon in Zürich, generously sponsored by Better and Google. My goal for this event was to attract new people to work on GHC, the Haskell compiler, so I announced a „GHC bugsquashing project“. I collected a few seemingly simple tickets that have a good effort/reward ratio for beginners and encouraged those who showed up to pick one to work on.
Roughly six people started, and four actually worked on GHC on all three days. The biggest hurdle for them was getting GHC built for the first time, especially for those using a Mac or Windows. They also had to learn to avoid recompilation of the whole compiler, which takes an annoying amount of time (~30 minutes for most people). But once such hurdles were taken, all of them managed to find their way around the source code to the place they needed to touch and were able to produce a patch, some of which are already merged into GHC master. When I wasn’t giving tips and hints I was working on various small tickets myself, but nothing of great impact. I very much hope that this event will pay off and one or two of the newcomers end up being regular contributors to GHC.
We took breaks from our respective projects to listen to interesting talks by Edward Kmett and Simon Marlow, and on Saturday evening we all went to the shores of the Zürichsee and had a nice barbecue. It was a good opportunity to get into contact with more of the attendees (the hacking itself was spread over multiple smaller office rooms), and I was happy to hear that people had read my recent Call Arity paper, and even found it valuable.
Thanks to the organizers and sponsors for this nice opportunity!