Japan Tsunami/Earthquake: Internet, damn resilient

So, every other means of communication went down, and the Internet is what still works. Why? Well, it was designed to be resilient, to survive nuclear war.

I hope that this can stop the downplaying and destruction of the open, free, distributed internet.

What makes the Internet great is this distributed design. Politicians want to remove that in order to stop child porn on the web. That must not happen, because first of all, it won’t really help, and secondly, the Internet’s free role and non-centralized design helps freedom, combats censorship, and in this case, allows for resilient systems that can easily be extended.

Maybe, instead of wanting to censor the Internet, we should make it even easier to build backbones for it. Ghettonet.

If you’re interested in this, you may want to read about FreedomBox:


Actually, seeing this crisis made me donate money to the FreedomBox foundation.

100% computer geek, be very afraid

So, apparently:

My computer geek score is greater than 100% of all people in the world!

And they also say:

Your computer geekiness is:

Step aside Bill Gates, Linus Torvalds, and Steve Jobs… You are by far the SUPREME COMPUTER GOD!!!

Oh well. I only answered all the questions truthfully, and actually thought “fsck, I won’t get full score on that question, but cheating is no fun”. And then I get a score that beats 100% of everyone else.

Like, I do actually have a girlfriend, and she is not geeky, etc.

Anyway, it’s a stupid test. A bit funny at times though ;-)

A bit sad getting that score as well though. I should really spend less time on the computer.


But I also like pretty things – and that score badge picture looks really horrible.

Chrome drops H.264 (yay!)

So this means “my” way of doing web streaming of conferences with Theora (and in the future, maybe VP8, but right now VP8 support == Theora support) is a good fit. I hope so much that VP8 wins the war. It’s obviously very important for the free software community, as developers can’t be expected to pay royalties for their users.

I know readers of this blog already know the countless evils of software patents, so I won’t go into that. Suffice to say, Google is doing a great thing for the openness of the web here. Take that, evil Apple!

I learned about it in this great article, Google Hands Open Video a Huge Win

Send files via SSH/SCP with Nautilus

I wanted a quick and easy way to share files, images, scripts etc. I searched for something like this about two years ago, but couldn’t find it. So I wrote my own, and wrote about it on my Norwegian blog.

Recently Omg! Ubuntu! wrote about a Nautilus Imgur-uploading script, and I wrote a small comment saying that I’ve used the same for some time, only to my own servers and using SCP. I said it also supported FTP, but I don’t know where I got that from; it doesn’t, and I have no need for FTP :-)

Anyway, this is how it works:

  1. Right click file (or files!) in Nautilus
  2. Choose script -> Send to Thor (where Thor is the name of a server)
  3. Wait a bit, and you will get a small notification saying that the file is uploaded
  4. To make it easy to share, the script sets your selection clipboard to the URL, so that you can just paste away
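Step 4 is really just string concatenation: the public URL is the configured HTTP base plus the uploaded file’s name. A stripped-down sketch of that part, where the base URL and file path are just placeholder values (the real script uses its `HTTP_URL` option and the selected Nautilus file):

```shell
# Hypothetical values, shaped like the HTTP_URL option in the script
HTTP_URL="http://example.com/folder/"
FILE="/home/user/Pictures/screenshot.png"

# The shareable URL is just the HTTP base plus the file's basename;
# this is what ends up in your clipboard via xclip
FILE_NAME=$(basename "$FILE")
echo "${HTTP_URL}${FILE_NAME}"   # prints http://example.com/folder/screenshot.png
```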

I use this directory structure, mostly:


You may download the send_via_scp script here.

I’m also pasting in this version of the script, just so that it doesn’t disappear.

[code lang="bash"]
#!/bin/bash
# Send your files to another machine using SCP
#
# REQUIRED
# - Nautilus (doh!)
# - scp
# - notify-send (from package notify-bin, libnotify - for messages)
# - xclip (to automatically set your clipboard to the URL of the file)
#
# To use (for single server):
# 1. Copy to ${HOME}/.gnome2/nautilus-scripts
# 2. Rename to "Send to MYSERVER" and make executable (chmod a+x)
# 3. Change the options below
#
# To use (for multiple servers):
# 1. Copy to your ${HOME}/.gnome2/nautilus-scripts directory
# 2. Make a file named "Send to MYSERVER" in the same directory
# 3. Put this into it:
# ---------------
# #!/bin/bash
#
# SCP_URL="odin@odin.s0.no:public_html/rot/`hostname`/"
# HTTP_URL="http://odin.s0.no/rot/`hostname`/"
# PROGNAME=`basename "$0"`
# dir=`dirname "$0"`
#
# export SCP_URL HTTP_URL PROGNAME
# source $dir/send_via_scp
# ---------------
# 4. Make it executable (you don't have to make send_via_scp executable)
#
# Author: Odin Hørthe Omdal < odin.omdal at gmail [dot] com >
# Version: 1.1
#
# Based upon the script to make symbolic links by
# Author: Jon Green < g-scripts [at] green-lines [dot] com >
# Version: 1.0

if [ "x$SCP_URL" == "x" ]; then
    # OPTIONS, edit these
    SCP_URL="user@domain:public_html/folder/"
    HTTP_URL="http://domain/folder/"
    # This is the name it will use for notifications
    PROGNAME=`basename "$0"`
fi

#### You don't have to edit below here ###

warning() {
    notify-send -i gtk-dialog-error "$PROGNAME" "$*"
}

message() {
    notify-send -i gtk-dialog-info "$PROGNAME" "$*"
}

declare -a NAUTFILES
export IX=0

while read FILE; do
    if [ "x${FILE}" != "x" -a "x${FILE}" != 'x"' ]; then
        NAUTFILES[${IX}]="${FILE}"
        IX=$[ ${IX} + 1 ]
    fi
done <<EOF
${NAUTILUS_SCRIPT_SELECTED_FILE_PATHS[@]}
EOF

if [ 0 -eq ${IX} ]; then
    URI="${NAUTILUS_SCRIPT_CURRENT_URI}"
    METHOD="${URI:0:7}"
    if [ "file://" == "${METHOD}" ]; then
        NAUTFILES[0]="${URI:7}"
        IX=1
    fi
fi

if [ 0 == "${#NAUTFILES[@]}" ]; then
    warning "Nothing to do"
    exit
fi

for FILE in "${NAUTFILES[@]}"; do
    FILE_NAME=`basename "${FILE}"`
    if ! scp "${FILE}" "$SCP_URL"; then
        warning "Couldn't send ${FILE_NAME}"
    else
        message "Uploaded to <a href='${HTTP_URL}${FILE_NAME}'>${HTTP_URL}${FILE_NAME}</a>"
    fi
    # Save the URL to the clipboard
    echo "${HTTP_URL}${FILE_NAME}" | xclip
done
[/code]

At FrOSCon with Lumiera

It seems rather long ago now, but it really isn’t. Not this weekend, nor the last, but the one before that again, my girlfriend Helene and I visited FrOSCon to meet up with parts of the Lumiera community.

We were there mostly to recruit developers, as we’re kinda short on that front. Everybody always wants to recruit more developers, but most other projects have a much smaller community and user base waiting for them than Lumiera has, so they also need to show themselves off and get people to use the project. Sadly, we’re not in that position yet, as Lumiera is still not able to do a single cut.

However, that was always the plan anyway: to do things correctly, fast and extensible. That is no small task when we’re talking about video editing. Lumiera is also built so that it can do rather complex compositing, and do so in a smart fashion – another point on the “it’s a big project” list.

Anyway, we got some considerable interest. I talked with a cool OpenMoko guy who’s primarily into «low level» stuff. Just what we need. What’s even better, he was interested in helping. Sadly, I haven’t been able to find him on the net since we left in a hurry to catch our plane.

I’m also going to help on uWiki so we can get it finished and hopefully accelerate the growth of the Lumiera code base.

A free, open source NX server from google; Neatx

When helping people from afar, it’s always nice to be able to log on and see their screen. It’s also cool to be able to log in on your desktop (not server, we use SSH for that) wherever you are. Oh well, I always run SSH on my computers anyway, but my dad has two computers and wants to use one or the other graphically.

Right now I use VNC (vino-server), but it’s truly dead slow. I’ve seen other people using NX from NoMachine and I’ve been impressed. I tried it myself (the FreeNX-server, that is), but I was never able to make it work.

Google also saw that the FreeNX server was hard to maintain, so they started a FOSS NX server from scratch; Neatx. Great!

It’s still not «released» for the general public, but I’m planning on keeping my eyes on it. Thought I’d just give people a heads up in case they’ve also tried FreeNX and couldn’t really make it work.

Update, 25th of July: I thought they had made a release because of the LWN headline, but still not. Plenty of good information in the LWN article (as always!) though: Google releases Neatx NX server

Hanging machine with NFS home folders; an unexpected fix

So I got complaints that one of the Ubuntu machines at work couldn’t even write an email without hanging and Thunderbird/Firefox going all gray/dark. Incredibly frustrating, as I found out after using it for a few minutes.

This was just adding up to the negative experiences some of my co-workers have had with the Ubuntu Linux option I’ve been pushing on them. Of course, unhappy coworkers make for a very un-nice working environment for me. Problems with Ubuntu Linux will pretty quickly culminate in everyone suggesting we just buy a boatload of licenses from Microsoft. We all know the faults with doing such a thing, so I’m resisting it, especially since there shouldn’t be a reason we have to use Windows. But these cases are real problems.

So I’ve been kinda bummed about this the last month, and I’ve spent a lot of time trying to find the problem. Then, while I was in bed last night, I remembered that I had stolen a switch from one of the rooms and replaced it with a HUB, saying to my coworkers while leaving:

Now you’ll burn with extreme low network performance, MOHAHAWAHWAHWHA. I’ll return it shortly, kthxbye.

And now, fast forward two months: still not replaced. And the switch I took is gathering dust in a box after I was finished with it.

Coming to the office, I saw immediately that I was right. There it was, that shitty HUB. I can’t believe that anyone actually used such crappy equipment once upon a time. HUBs really do suck.

It was fun for exactly 5 minutes to watch the collision light on the HUB and Firefox going dark gray in sync, staying like that for a few seconds. After I switched in the switch (;-)) performance peaked again. Ahh. Nice. So now I can continue rolling out Ubuntu machines (yay).

I was very happy about this, as I said on my Norwegian microblog:

JAAA! EG FANN FEILEN! Eg sa NFS saug, maskini her hang seg ofte. Men eg fann ein HUB i netverket til @neitileu! DOH DOH DOH! Kollisjon!

Translated that is: YEES! I FOUND THE ERROR! I said NFS sucked, the machine here hung often. But I found a HUB in @neitileu’s (No to EU’s) network! DOH DOH DOH! Collision!

Firefox 3.5 (or Swiftweasel) + Video is HOT

I’m so extremely thrilled that Firefox 3.5 will be getting such incredibly nice video support.

I took a small clip from my Nikon D90, and lo and behold. Did a quick ffmpeg2theora (yes, I know it sucks right now, but Thusnelda will fix that, and I couldn’t be bothered to download the GREAT improvements to Theora lately), and used the WordPress upload function. Of course, WordPress is rather stupid, so I had to write the tag myself, but damn, that is ever so easy.

Anyone think this is as easy as embedding pictures? You’re damn right! I’m so thrilled with this. You can see my small clip of the river outside my bedroom and my girlfriend talking/singing on my norwegian blog.

Great! If you use Swiftweasel 3.5 (quick as hell!) or Firefox 3.5 you’ll be able to see the video with no problem. And it’s actually very nice to use. Much better than any of the video solutions I’ve seen before. It’s even better than the Flash players we’re used to seeing (because you can easily download the video).

Sorry that markdown plugin wasn’t on Habari for a short time, I’ve installed it again now. :-)

The Lena standard test image, full version (!)

Wow. I just stumbled upon the Lena test image (I was looking for restoration techniques). Everyone with an interest in digital image compression has seen this picture many times. I’d seen it in my textbooks and on the net long before I even knew how to program.

It’s a beautiful picture, and I have actually thought about where it’s from many times. Because this picture had a caption saying this was «Lena», I started searching and reading up on the picture.

Turns out it’s from Playboy November 1972! And it’s a crop! That’s an interesting turn.

So, after scouring the web, I found the full image (it wasn’t really that hard either).

I always thought that picture had a more painted feel to it; now I know it’s from the scanner they used to scan it, because the real picture looks more, uh, real. I actually prefer the colors in the cropped Lena picture. But that may as well be because I’ve come to love the picture after seeing it so often (and think of it as the real version).

So, here it is, in all its glory.

So, there’s a bit of added trivia. I’ve spent a lot of time on this now; I should start doing something else. :-)

Installing (compiling) DVswitch on Ubuntu

DVswitch is a great program: you connect a few laptops in a network, each with a DV camera attached, and then on one computer you use the main dvswitch program to mix all the video sources easily. It’s really a great program! I’ll come back with a more thorough walkthrough (well, I guess I’ll just make a video of it ;-) )

I just need this as a quick reminder, so I don’t have to run cmake 10.000 times to find out the dependencies.

sudo aptitude install build-essential libavformat-dev libavcodec-dev libavutil-dev \
libtheora-dev libogg-dev libvorbis-dev libavc1394-dev libasound2-dev subversion \
cmake libgtkmm-2.4-dev libboost-thread-dev libxv-dev

Then do a checkout

svn co svn://svn.debian.org/dvswitch/

Check it with cmake (and remove that cache file between each try; there shouldn’t be more than one try needed now anyway)

rm CMakeCache.txt ; cmake .

That dot is important. And if it’s all good, compile it:

make
Then make the share-directory:

sudo ln -s $(pwd)/data /usr/local/share/dvswitch

And that should be it. You might want to put the src-dir in the path as well.
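For that last step, a line like this in your ~/.bashrc does the trick; the ~/dvswitch location is just my assumption about where you ran the svn checkout, so adjust it to your own path:

```shell
# Append the DVswitch source directory to PATH so dvswitch and its helper
# tools can be run from anywhere. ~/dvswitch is an assumed checkout location.
export PATH="$PATH:$HOME/dvswitch/src"
```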

Musings about video, html, tech and linux