Simple image filters with getUserMedia

I forgot to blog about this last week, but Justin made me remember. The WebRTC getUserMedia API is available on the Nightly and Aurora channels of Firefox right now, and Tim has done a couple of great demos of using JavaScript to process the media stream. That got me interested, and after a little playing around I remembered learning the basics of convolution image filters, so I thought I’d give it a try. The result is a sorta ugly-looking UI that lets you build your own image filters to apply to the video coming off your webcam. There are a few pre-defined filter matrices there to get you started, and it’s interesting to see what effects you can get. Remember that you need to enable media.navigator.enabled in about:config to make it work.
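
For anyone wanting to play along, here’s a minimal sketch of the technique (not the demo’s actual code, and using the modern promise-based getUserMedia rather than the prefixed API Nightly shipped at the time): draw each video frame to a canvas, run a 3×3 kernel over the pixel data, and paint the result back.

    // Minimal sketch: convolve webcam frames with a 3×3 kernel.
    var video = document.querySelector('video');
    var canvas = document.querySelector('canvas');
    var ctx = canvas.getContext('2d');

    // Example kernel: edge detection. Try a blur or sharpen matrix instead.
    var kernel = [-1, -1, -1,
                  -1,  8, -1,
                  -1, -1, -1];

    navigator.mediaDevices.getUserMedia({ video: true }).then(function (stream) {
      video.srcObject = stream;
      video.play();
      requestAnimationFrame(processFrame);
    });

    function processFrame() {
      var w = canvas.width, h = canvas.height;
      ctx.drawImage(video, 0, 0, w, h);
      var frame = ctx.getImageData(0, 0, w, h);
      var src = frame.data;                  // read from the current frame
      var out = ctx.createImageData(w, h);
      var dst = out.data;                    // write into a fresh buffer

      // Skip the one-pixel border to keep the sketch free of bounds checks.
      for (var y = 1; y < h - 1; y++) {
        for (var x = 1; x < w - 1; x++) {
          for (var c = 0; c < 3; c++) {      // red, green, blue channels
            var sum = 0;
            for (var ky = -1; ky <= 1; ky++) {
              for (var kx = -1; kx <= 1; kx++) {
                sum += src[((y + ky) * w + (x + kx)) * 4 + c] *
                       kernel[(ky + 1) * 3 + (kx + 1)];
              }
            }
            dst[(y * w + x) * 4 + c] = sum;  // Uint8ClampedArray clamps to 0-255
          }
          dst[(y * w + x) * 4 + 3] = 255;    // opaque alpha
        }
      }
      ctx.putImageData(out, 0, 0);
      requestAnimationFrame(processFrame);
    }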

The downside is that, whether through me missing an obvious optimisation or JS just being too slow right now, it isn’t fast enough for real use. Even a simple 3×3 filter is too slow on my machine, since it ends up having to do nine calculations per pixel, which is just too much. Can someone out there make it faster?

Update: The ever-awesome bz took a look and pointed out three quick fixes that made the code far faster; it now runs at almost realtime with a 3×3 filter for me. First, he pointed out that storing frame.data in a variable outside the loops, rather than resolving it each time, speeds things up a lot. Secondly, let apparently isn’t fully supported by IonMonkey yet, so it switches to a slower path when it encounters one. Finally, I was manually clamping the result to 0-255, but pixel data is a Uint8ClampedArray so it clamps itself automatically.
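
To make those three fixes concrete, here’s a tiny illustrative snippet; brighten is a made-up filter, not code from the demo:

    // Hypothetical filter showing all three fixes at once.
    function brighten(frame, amount) {
      var data = frame.data;         // fix 1: hoisted once, not resolved per iteration
      // fix 2: var rather than let, which IonMonkey deoptimises for now
      for (var i = 0; i < data.length; i += 4) {
        // fix 3: no Math.min/Math.max needed; writes to a
        // Uint8ClampedArray clamp to 0-255 automatically
        data[i]     += amount;       // red
        data[i + 1] += amount;       // green
        data[i + 2] += amount;       // blue
      }
    }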

Six hour chilli

I’ve enjoyed making and eating chilli ever since I was in university. It has a lot of appeal for your average student:

  • You can make it in bulk and live off it for a few weeks
  • You can make it spicy enough that you start bleeding out of your nose (and thus totally impress all your friends)
  • It’s dead simple and really hard to screw up

The great part is that once you have the basic ingredients you can really just go nuts: vary the ratios, add extra bits and pieces, whatever you fancy. Eventually though I discovered that I was getting a bit too wild and really wanted to nail down a basic “good” recipe to work from. It took a couple of tries, but here it is. It’ll take around six hours to cook, plus an extra hour for prep depending on how lazy/drunk you are.

Ingredients

  • 2 cups chopped onions
  • 2.5 lb lean ground beef

 

  • 56 oz diced tomatoes (canned works just fine)
  • 2 tsp chilli powder
  • 2 tsp chilli pepper flakes
  • 1 tsp ground cumin
  • 1 tsp cayenne pepper
  • 1/8 cup cilantro leaves (I use dried but fresh is probably good too)
  • 1/2 tsp chipotle chilli pepper powder
  • 1/2 pint stout or other dark beer
  • 2 cups red wine (I tend to use merlot)
  • 2 cups beef broth
  • 1/8 cup tomato puree or 1/4 cup tomato ketchup
  • ~3-4 tsp coarsely ground black pepper

 

  • 3×12 oz cans of mixed beans
  • 2-3 large bell peppers (red and/or green), chopped

Instructions

  • Fry the onions in some olive oil until they start to turn translucent, then throw in the beef and stir until it browns all over.
  • Transfer to a large pot and add the tomatoes, spices, wine, beer and broth.
  • Cover and simmer for 2 hours, stirring periodically (use the remains of the stout to help you get through this).
  • Drain and add the beans.
  • Cover and simmer for 2 hours, stirring periodically (it’s possible you’ll need more stout).
  • Add the chopped peppers.
  • Cover and simmer for 1-2 hours, stirring periodically.
  • Serve with cornbread and the rest of the bottle of wine.

Tips

  • This makes a lot of chilli, so make sure you have a large enough frying pan and pot. You can try scaling it down, but I find it works best when made in large quantities.
  • After adding the peppers you’re basically just simmering until it reaches the right consistency; uncovering the pot and/or adding some corn starch can help thicken it up at this stage.
  • This recipe comes out pretty spicy, so you might want to drop the chipotle chilli pepper powder if that isn’t your thing.
  • Unless you are feeding an army, make sure you have tupperware to hold the leftovers. It reheats very well; if using a microwave, throwing in a little extra water helps.

After an awesome Jetpack work week

It’s the first day back at work after spending a great work week in London with the Jetpack team. I was going to write a summary of everything that went on, but it turns out that Jeff beat me to it. That’s probably a good thing as he’s a better writer than I am, so go there and read up on all the fun stuff that we got done.

All I’ll add is that it was fantastic getting the whole team into a room to talk about what we’re working on and get a load of stuff done. I ran a discussion on what the goals of the Jetpack project are (I’ll follow up on this in another blog post to come later this week) and was delighted that everyone on the team is on the same page. Employing people from all around the world is one of Mozilla’s great strengths but also a potential risk. It’s vital for us to keep doing work weeks and all hands like this to make sure everyone gets to know everyone and is working together towards the same goals.

Where are we all now?

Quite a while ago now I generated a map showing where all the Mozilla employees are in the world. A few of the new folks on my team reminded me of it, and when I showed it to them they wanted to know when I was going to get around to updating it. Challenge accepted! Back then the Mozilla phone book was available as a static HTML page, so scraping it for the locations was easy. Now the phonebook is a prettier webapp, so the same technique doesn’t quite work. Instead I ended up writing a simple add-on to pull the main phonebook tree and then request the JSON data for each employee, dumping it all to a file on disk. Then some manipulation of the original scripts led to great success. Click through for a full zoomable map.

[Map: Mozilla staff around the world]
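
For the curious, the add-on boiled down to something like the sketch below. The endpoint paths and JSON field names here are entirely made up, the phonebook needs an authenticated session (which is why this ran as an add-on inside the browser rather than a standalone script), and the original used the add-on APIs of the day; fetch just keeps the sketch short.

    // Rough sketch only: URLs and JSON field names are hypothetical.
    var BASE = 'https://phonebook.mozilla.org';  // needs a logged-in session

    async function dumpLocations() {
      // Pull the main phonebook tree, then the JSON record for each person.
      var tree = await (await fetch(BASE + '/tree.json')).json();
      var results = [];
      for (var person of tree.people) {
        var res = await fetch(BASE + '/details/' + person.id + '.json');
        var details = await res.json();
        results.push({ name: details.name, location: details.location });
      }
      // The add-on dumped this to a file on disk; logging works for a sketch.
      console.log(JSON.stringify(results, null, 2));
    }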

One of the criticisms I heard the last time I did this was that it didn’t contain any information about the rest of the community. I’ll say basically the same thing that I said then: get me the data and I’ll gladly put it on a map for all to see, but right now I don’t know where to find that information. I think this is something that Mozillians could do, but I’m not sure what the chances are of getting everyone in the directory to add that info to their profile. I think this information is still useful even without the community, as it demonstrates how global we are (and in some cases aren’t!) as a company.

Managing changes is the key to a project’s success

TomTom made an interesting claim recently. Their summary is that “when it comes to automotive-grade mapping, open source has some quite serious limitations, falling short on the levels of accuracy and reliability required for safe navigation”.

This is a bold claim, and they talk about recent studies that back them up. Unfortunately none of them are referenced, but it’s pretty clear from the text of the article that all they are doing is comparing the accuracy of TomTom’s maps with existing open source maps. So they’re just generalising; this doesn’t prove a limitation of the open source process itself, of course, just perhaps of a particular instance of it.

In fact, having read the article, I think TomTom are just misunderstanding how open source development works. Their basic complaint seems to be that open source maps are entirely community generated with no proper review of the changes made. In such a situation I’m sure the data generated is always going to be liable to contain errors, sometimes malicious, sometimes honest. But that isn’t how open source development works in general (I make no claim to know how it works for open source mapping); I’d probably call such a situation crowd-sourcing.

Successful open source projects rely on levels of management controlling the changes that are made to the central repository of the source code (or, in this case, mapping data). In Firefox, for example, every change is reviewed at least once by an expert in the area of the code affected before being allowed into the repository. Most other open source projects I know of run similarly. It’s this management that is, in my opinion, key to the success of a project. Clamp down too hard on changes and development is slow, contributors get frustrated and walk away; be too lenient and too many bugs get introduced or the project veers out of control. You can adjust the level of control based on how critical the accuracy of the contribution is. Of course this isn’t some special circumstance for open source projects; closed source projects should operate in the same fashion.

The part of their post that amuses me is when they say “we harness the local knowledge of our 60 million satnav customers, who can make corrections through TomTom Map Share”. So basically they accept outside contributions to their maps too. As far as their development goes, it sounds like they function much like an open source project to me! The only claim they make is that they have better experts reviewing the changes that are submitted. This might be true, but it has nothing to do with whether the project is open source or not; it’s just a question of who you find to control the changes submitted.

There is of course one place where open source is at an arguable disadvantage: the latest bleeding-edge source is always available (or at least should be). If you look at the changes as they come in, before QA processes and community testing have gone on, then of course you’re going to see issues. I’m sure TomTom have internal-only development versions of their maps that have their fair share of errors waiting to be ironed out too. Open source perhaps makes it easier to end up using these development versions, so unless you know what you’re doing you should always stick to the more stable releases.

Just because a project accepts contributions from a community doesn’t mean it is doomed to fail, nor does it mean it is bound to succeed. What you have to ask yourself before using any project, open source or not, is how good those controlling the changes are and how many people are likely to have reviewed and tested the end result.


Mossop Status Update: 2012-05-11

Done:

  • Submitted pdf.js packaging work for review (bug 740795)
  • Patched a problem on OSX with FAT filesystem profiles (bug 733436)
  • Patched a problem with restartless add-ons when moving profiles between machines (bug 744833)
  • Added some quoting for the extensions crash report annotation (bug 753900)
  • Thoughts on shipping the SDK in Firefox and problems with supporting other apps: https://etherpad.mozilla.org/SDK-in-Firefox

I made a thing to help with GPS in Lightroom

Yep, this post is totally not Mozilla related so I’ll keep it short, but a lot of people at Mozilla do take great photos, and maybe they are stuck in my position: no actual GPS device and a compulsion to try to correctly GPS-tag a vast photo collection. I put it off for a long while, but finally this weekend I wrote a Lightroom plugin to help ease a lot of the manual labour. Go check it out if you’re interested. It’s even on GitHub!