Realtek 8188eu-based Wi-Fi adapters on the Raspberry Pi

Raspberry Pi enthusiasts like to use Realtek 8188eu-based Wi-Fi adapters. Known as “micro” Wi-Fi adapters, they are small and inexpensive. For example, the TP-LINK WN725N V2 can be found for under $20 (the V1 and V3 use different chipsets; thanks for making everything confusing, TP-LINK!).

As of October 2014 and Linux 3.12.29+, the driver for this Wi-Fi adapter isn’t available in the Raspberry Pi Linux kernel, so these adapters are not supported out of the box, despite what you may find on the Internet. They definitely don’t work in a plug-and-play manner.

You can get these adapters to work by using an out-of-tree driver (i.e., a driver not included with the Linux kernel). People on the Internet have resorted to all sorts of silly things, like downloading random kernel modules or compiling vendor Linux kernel trees from scratch, to get this driver working. You’re probably reading this article because you, like me, thought it was silly to chase files around some random Internet forum.

The source code for the 8188eu driver is available on GitHub. Rather than trying to look for random files and hope they work, you can keep the source for the module on your Raspberry Pi and recompile the module as needed. Below are copy-and-pastable instructions to do this on Raspbian.

Make sure you have a compiler and the usual build tools installed:

sudo apt-get install build-essential git

Download the source code for the driver. I prefer to put source code for stuff like this somewhere convenient in my home directory, like ~/src/drivers/.

mkdir ~/src/drivers
cd ~/src/drivers/
git clone https://github.com/lwfinger/rtl8188eu.git

Now that you have the source code for the driver, you need the source code for the kernel running on your Raspberry Pi. Finding the source for the currently-running kernel can be a pain. I like to use notro’s rpi-source tool, which you can download, install, and run:

sudo wget https://raw.githubusercontent.com/notro/rpi-source/master/rpi-source -O /usr/local/bin/rpi-source && sudo chmod +x /usr/local/bin/rpi-source && /usr/local/bin/rpi-source -q --tag-update
sudo rpi-source

If you get an error about GCC, use the --skip-gcc flag:

sudo rpi-source --skip-gcc

Now, compile the driver:

cd ~/src/drivers/rtl8188eu
make all
sudo make install

Your Wi-Fi should now be working, though you’ll need to configure it (configuration is out of scope for this article).

You can double-check whether the adapter is recognized by looking at the output of the following (assuming you have only one Wi-Fi adapter, it will be named wlan0):

sudo iwconfig wlan0

If you see nothing, the driver may have been installed incorrectly, or your Wi-Fi adapter may not be working. If you see something like:

wlan0     unassociated  Nickname:"<WIFI@REALTEK>"
          Mode:Master  Frequency=2.412 GHz  Access Point: E8:DE:27:9E:65:D7   
          Sensitivity:0/0  
          Retry:off   RTS thr:off   Fragment thr:off
          Power Management:off
          Link Quality=0/100  Signal level=0 dBm  Noise level=0 dBm
          Rx invalid nwid:0  Rx invalid crypt:0  Rx invalid frag:0
          Tx excessive retries:0  Invalid misc:0   Missed beacon:0

Then you are good to go!

All of this seems like a lot of work compared to hunting down a prebuilt file, right? Except it isn’t: when you update the kernel on your Raspberry Pi and reboot, your Wi-Fi will stop working, because the new kernel no longer has an appropriate driver. Rather than going on another witch hunt for files, you can simply recompile the driver, even while offline:

cd ~/src/drivers/rtl8188eu
git pull # Run this if you have Internet access even though your Wi-Fi is down
make all
sudo make install

The above is all you need to turn your Raspberry Pi into a working Wi-Fi client.
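If you’d rather not remember this routine, a small shell sketch can detect a missing driver after a kernel upgrade and remind you to rebuild. The module path and name (8188eu.ko) below are my assumptions, not something this article’s steps guarantee; check what make install actually put under /lib/modules on your system and adjust.

```shell
#!/bin/sh
# Sketch: warn when the running kernel has no 8188eu module installed.
# The module path is an assumption; verify yours with:
#   find /lib/modules/$(uname -r) -name '8188eu*'

needs_rebuild() {
    # needs_rebuild KVER MODDIR -> succeeds (exit 0) when no module
    # exists for kernel version KVER under MODDIR
    kver="$1"
    moddir="$2"
    [ ! -e "$moddir/$kver/kernel/drivers/net/wireless/8188eu.ko" ]
}

if needs_rebuild "$(uname -r)" /lib/modules; then
    echo "No 8188eu driver for kernel $(uname -r);"
    echo "run 'make all && sudo make install' in ~/src/drivers/rtl8188eu"
fi
```

You could run this from your shell profile or a boot script so a broken driver announces itself instead of leaving you wondering why wlan0 disappeared.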

If you want to use your shiny new Wi-Fi enabled Raspberry Pi as an access point, there’s a gotcha with Realtek Wi-Fi adapters (I’ve found this true of both the 8188eu and 8192cu chips): they won’t work with hostapd as included with Raspbian or Arch Linux. You’ll need to use a custom hostapd, such as this one posted by Jens Segers, to get a working access point.

Setting up package caching for pip and npm

If you’re on a metered Internet connection (like me), downloading and re-downloading packages to set up your Python virtualenvs and Node.js projects isn’t much fun. Besides wasting data, it’s slow.

These package managers actually have caching built-in, though it needs to be configured. Follow the instructions below to configure Python’s pip, Node.js’ NPM, and Twitter’s Bower. I assume you’re on Linux and want to use XDG-compliant directories.

Python pip

Configuring a cache directory for pip will cache both source and wheel-based packages.

mkdir -p ~/.pip ~/.cache/pip
cat >> ~/.pip/pip.conf << EOF
[global]
download_cache = ~/.cache/pip
EOF

pip’s package cache is keyed by URL, not by package, which means pip’s caching doesn’t work with mirrors. The pip manual describes a mechanism for fast and local installs if this is a problem for you.

Node.js NPM

NPM does cache packages by default, but in a non-XDG-compliant directory. Reconfigure it like so:

npm config set cache ~/.cache/npm
mkdir -p ~/.cache/npm
mv ~/.npm/* ~/.cache/npm/
rm -rf ~/.npm/

See NPM’s documentation for package caching for further detail.
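One caveat about the mv above: if ~/.cache sits on a different filesystem than your home directory (say, a separate partition), a plain mv can fail partway through. Here’s a small defensive sketch of the same migration; the move_cache helper is my own invention, not part of npm:

```shell
#!/bin/sh
# Defensive cache migration: copy first, remove only after the copy
# succeeds, so a cross-filesystem move can't leave things half-migrated.
# move_cache is a hypothetical helper, not an npm command.
move_cache() {
    old="$1"
    new="$2"
    mkdir -p "$new"
    if [ -d "$old" ]; then
        # GNU cp -a preserves permissions and timestamps
        cp -a "$old/." "$new/" && rm -rf "$old"
    fi
}

move_cache "$HOME/.npm" "$HOME/.cache/npm"
```

Running it when the old cache doesn’t exist is harmless: it just ensures the new directory is there.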

Twitter Bower

Bower, like NPM, caches packages by default in a non-XDG-compliant directory. Reconfigure it and create a .bowerrc:

bower cache-clean
cat >> ~/.bowerrc << EOF
{
  "cache": "$HOME/.cache/bower/"
}
EOF
mkdir -p ~/.cache/bower

None of pip, npm, or bower purge outdated or unused packages. You will have to manage this yourself, unless you are OK with these caches growing without bound (disk space is cheap these days).

Managing these caches is tricky: since modern UNIXes no longer update last-accessed times (aka atime) on files, you can’t purge packages based on when they were last used. I leave figuring out the best way to purge these directories as an exercise to the reader (please leave a comment!).
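With atime out, one crude workaround is to purge by modification time instead: a cached package file untouched for months was probably fetched for a version you’ve since moved past. A hedged sketch (the 90-day cutoff is arbitrary, and purge_old is my own helper name; ~/.cache/pip matches the configuration above):

```shell
#!/bin/sh
# Sketch: delete cached files not modified in N days, then remove any
# directories the purge left empty. This is mtime-based -- a heuristic,
# not a true least-recently-used policy.
purge_old() {
    cachedir="$1"
    days="$2"
    find "$cachedir" -type f -mtime +"$days" -delete
    find "$cachedir" -mindepth 1 -type d -empty -delete
}

if [ -d "$HOME/.cache/pip" ]; then
    purge_old "$HOME/.cache/pip" 90
fi
```

Dropped into a weekly cron job (for each of the cache directories above), this keeps the caches bounded at the cost of occasionally re-downloading something you actually still use.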

First post!

I’m rebooting this blog. It’s a long time coming.

My first “blog” was back in 2003. As was fashionable back then, I wrote my own, custom blogging software in PHP. In 2005, I ditched the custom software and started using Drupal, starting with Drupal 4.5.

I became fairly skilled with Drupal: I wrote themes and modules, launched several Websites for myself and clients, ran a Drupal-oriented consulting firm and Web hosting company, and managed scalability at a startup built on Drupal.

I, very painfully, upgraded those sites and this blog, which started with Drupal 4.5, all the way up to Drupal 7. It was not easy or fun.

I had to constantly deal with security upgrades. Upgrading to new Drupal versions required a witch hunt to find which 3rd party modules were deprecated, which new modules I should use to replace them, and in what ways those modules worked differently. And then there was all the spam—even with Acquia’s Mollom service, too much spam got through, requiring constant moderation.

The thing that did Drupal in for me was SA-CORE-2014-005, fixed in Drupal 7.32. After over 10 years of following Drupal security updates, I missed one, by just a few hours—and my site was hacked. While restoring from my backups, I made a mistake and deleted the most recent one.

If I wasn’t turned off to PHP Web applications before, I’m really, really turned off to them now. After using Drupal for almost 10 years, I’m switching to something else (at least for my personal stuff).

Because I didn’t want to deal with upgrades or security issues, using something like Ghost or WordPress was out.

The new Samat Says is powered by Nikola, a static site generator. I looked at Jekyll, Blogofile, and Pelican, but Nikola seemed the easiest for me to understand and extend, and, most importantly, it has a vibrant, active community.

For a long time, I cared about owning my comments and not being tied to a 3rd party service. After dealing with all the spam my old Drupal site received, I now care quite a bit less. I’d like to move to a self-hosted, or better yet federated, commenting solution in the future but for the time being I’m using Disqus. I haven’t figured out how I will migrate old comments yet.

Worth mentioning: you may have noticed this isn’t an actual first post. I’m slowly migrating content (probably only the popular bits) from my old Drupal installation over to this Nikola-powered one.

Weekend irony with the University of Florida

Irony: this weekend (Apr 21–22), NPR’s Wait Wait… Don’t Tell Me! re-ran a segment with author Jack Gantos. In it, Jack Gantos makes a crack at the University of Florida:

I drove up to University of Florida. It looked just like my high school — a giant football facility with a small academic institution

Apparently, University of Florida thought this was a compliment. Forbes reports (via) that University of Florida has eliminated research & graduate work in its Computer Science department, while simultaneously significantly increasing funding for athletics.

A clarification missing from the Forbes article: UF is eliminating research and graduate work in its computer science department, just like it did for its nuclear engineering department the year prior. The departments will remain as severely gimped teaching-only undergraduate departments… not unlike a glorified community college.

Keep it real University of Florida!

Spaceport America on OpenStreetMap


Spaceport America is “the world’s first purpose-built commercial spaceport”. Wonder what it looks like? You can now find it on OpenStreetMap, one of the many things I’ve been mapping in New Mexico’s barren & isolated Jornada del Muerto. I’ve indicated various Spaceport America structures, like the state-of-the-art Terminal Hangar Building and Spaceport Operations Center. I’ve yet to accurately locate Spaceport America’s vertical launch pad, which has been in use since 2007.

No, it’s not on Bing Maps, Google Maps, or any of the other Web mapping competitors—just in case you needed a reason why crowd-sourced geodata (or VGI) can’t be beat.

Want an aerial photo of Spaceport America? Over on Flickr I’ve a screencap from the USDA’s public-domain NAIP 2011 release, pretty much the only source for high-resolution imagery of the middle of nowhere.

See Spaceport America on OpenStreetMap

A street without rules, a safer street?

A few weeks ago, I started listening to another podcast: WHY? Philosophical Discussions About Everyday Life, hosted by philosopher Jack Russell Weinstein.

Listening through the backlog, I found an excellent show, Episode 28: “On Liberty and Libertarianism” with guest James Otteson. In it, Jack and James philosophize about so-called “Libertarianism”, talking about how government should relate (or not relate) to both social and moral issues.

One of James’ fantastic talking points was on traffic. In short, all the rules and regulations that both drivers and pedestrians must follow are dehumanizing. Destroying the human connection between driver and pedestrian takes the social issue of road sharing and turns it into… well, something else, where drivers and pedestrians no longer need to think—it becomes a matter of just reading signs, staying within lines, and blindly following the guidance of blinking lights.

In the show, they discuss a Finnish town with a high number of traffic accidents. The town removed traffic lights, stop signs, and other regulatory sundries, and traffic accidents went down.

They’re beginning to do the same on London’s Exhibition Road in the UK:

The idea is that when driving zones are heavily delineated, drivers tend to be on autopilot, focusing on other cars rather than pedestrians or cyclists. That’s why London has so many guard rails on either side of pedestrian crossings, preventing pedestrians from straying into the road where they’re not supposed to. But 10 years ago, Kensington and Chelsea experimented with removing the railings from Kensington High Street and found that the number of pedestrian accidents dropped by 60%. It seems that when drivers are forced to be more aware and pedestrians are forced to take more responsibility for themselves, everyone is safer. Rules, it seems, were counterproductive.

Interestingly enough, The Guardian publishes this in the Arts & Design section and describes the movement as liberal. In my opinion, it’s anything but. Leave it to the Europeans to re-pioneer freedom & common sense.

Upload multiple photos to Meetup without Flash

Do you use Meetup and upload multiple photos regularly, but hate doing it?

Meetup has a multiple-photo upload feature that uses Adobe Flash, but as you’d expect of Flash, it’s not particularly reliable or stable. There’s no reason to put up with Flash’s nonsense: HTML5 includes a multiple-file upload control, well supported by the latest Web browsers. Unfortunately, despite my posting a wishlist item (please vote!), Meetup has done nothing to support it.

So I did it myself: if you use Chrome or Greasemonkey/Scriptish for Firefox, install this user script: Meetup: HTML5 multiple-file upload for photos.

Once installed:

  1. Go to the “Old Upload Form” for your Meetup group or album. This can be tricky to get to, but the URL looks like: http://www.meetup.com/GROUPNAME/photos/upload/
  2. Make sure you’ve selected the right album.
  3. You should see only one file upload widget (before this script, the “Old Upload Form” had 10).
  4. Click it, and you’ll notice you can select multiple photos you want to upload. Go ahead and do so.
  5. After you’re done selecting and dismiss the widget, the page will now tell you which photos you’ll be uploading.
  6. Click upload to start uploading photos.

Enjoy uploading your photos without Flash’s crashing, errors, or mayhem!

A note: if you use Firefox, you won’t be able to tell how much you’ve uploaded (unlike Chrome, Firefox has no built-in upload progress meter). Try the Upload Progress add-on to keep tabs on your uploads.

Changing Drupal 7’s built-in jQuery UI theme

jQuery UI, a JavaScript widget framework built upon jQuery, comes built into Drupal 7 core. One of jQuery UI’s nicer features is that you can switch themes by swapping out a single CSS file.

There are some nice jQuery UI themes out there (unfortunately, not enough!), like Tait Brown’s port of Aristo to jQuery UI (see demo).

But since jQuery UI is in Drupal core, which internally keeps track of CSS files, how do you switch the jQuery UI theme in use?

The Seven theme, included with Drupal core, provides inspiration for the “one true Drupal way” of doing this: implementing hook_css_alter(). Place the following into your theme’s template.php:

function MYTHEME_css_alter(&$css) {
  if (isset($css['misc/ui/jquery.ui.theme.css'])) {
    $css['misc/ui/jquery.ui.theme.css']['data'] = drupal_get_path('theme', 'MYTHEME') . '/jquery.ui.theme.css';
  }
}

Replace “MYTHEME” with the name of your theme, and adjust the path to your jQuery UI theme’s CSS file accordingly (the above assumes you place jquery.ui.theme.css in the root folder of your theme).

With this magic in hand, I now have the Aristo jQuery UI theme running on this blog. Looks quite a bit better!

This post was inspired by an answer I posted on StackExchange.

BrowserID session API support for Drupal

Late last week, Mozilla’s Identity team made available a Firefox extension for BrowserID, a new browser-oriented single sign-on mechanism. Click a button in your address bar and automagically log in to a website.

Along with it, they made available a browser session API: the browser can now keep track of and show whether you’re logged in or logged out, also in your address bar.

Drupal had a BrowserID module less than 24 hours after BrowserID’s initial announcement (thanks Isaac Sukin!). Likewise, in the weekend after the session API announcement I helped out and wrote a patch adding support for the new API.

If you’re familiar with Drupal development, install the Drupal module, apply the patch, install the Firefox add-on, and get browser-integrated, one-click login to your Drupal-powered website.

The patched module is running live on this site right now, so please play with it (myfavoritebeer.org does get boring).

At the moment, Drupal’s BrowserID module does not create accounts on my blog, so you must do that first, separately. Create an account here, or if you have an OpenID, log in with your OpenID directly to also create an account (funny how complicated this has gotten already). Make sure to set and use the same e-mail address as the one you use for your BrowserID. After creating an account, log out, then log back in using your BrowserID. If you have problems or find a bug, please leave a comment on the Drupal bug or this blog post. Thanks!

[UPDATE: 16 Aug 2011]: Drupal’s BrowserID module now includes my patch; you don’t need to download and apply the patch separately.