nRF24 walk through – Introduction

The Nordic nRF24 is a family of single-chip radio transceivers operating in the 2.4GHz band, the most popular being the nRF24L01. This is the core element of some extremely cheap module boards available in online stores like eBay, Aliexpress and Banggood.

These boards do not provide WiFi (802.11) or Bluetooth connectivity (both also in the 2.4GHz band), but they can be used to establish custom wireless networks between small electronic devices, including Arduino, RaspberryPi and Particle (formerly known as Spark).

Whenever we talk about networks you must take into account a few key aspects of networking, one of the most important being the network topology.

[Figure: common network topologies]

During this series we will aim to establish a star network between a set of Arduino based peripheral nodes and a central hub node, the latter being either a RaspberryPi or a Particle Core/Photon. This is a basic but invaluable configuration, allowing complex processing of the remotely collected information, either on premise (RPi) or in the cloud (Particle).

If this series gets enough attention I might invest some more time and extend it to cover more complex topologies like tree and mesh, with the latter being my favorite and, IMHO, most valuable for inexpensive IoT projects.

The project

To keep things simple, our peripheral nodes will only collect button presses, notifying the central hub whenever a button gets pressed: the central hub will periodically (say, once a minute) print out the number of button presses it has received, with a breakdown for each node; something like:

Received 14 clicks in the past 1 minute(s)
* 5 click(s) from node A
* 2 click(s) from node B
* 7 click(s) from node F

This is obviously just an example of what you will be able to do from the hub node; nothing prevents you, for example, from pushing the data into a database and generating graphs. You could aggregate the data differently or, more likely, collect other types of data from your sensor nodes: I’m not here to place constraints on your imagination!
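
To make this concrete, here is a minimal sketch of the kind of payload a peripheral node could transmit over the radio; the struct name and fields are purely illustrative, not necessarily the format we will adopt later in the series:

#include <stdint.h>

// Hypothetical payload sent from a peripheral node to the hub:
// a node identifier plus the button presses counted since the
// last transmission.
struct ClickMessage {
  uint8_t nodeId;  // which node is reporting, e.g. 'A', 'B', 'F'
  uint8_t clicks;  // button presses since the last report
};

A payload this small fits comfortably within the packet sizes these radios can handle, something we will come back to when discussing their limitations.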

Keep in mind, though, that the little radio transceivers we are using have a few commonly misunderstood limitations, which we will analyze as we get to them during the project.

The steps

This walk through will be split into the following posts:

The SparkCore: makers meet the cloud!

OK, I admit it: I’ve been caught up in the maker fever.

Now, I’ve got two Spark Cores (those little devices are amazing!) which I intend to use for a project of mine, but the nRF24L01 library I need has not yet been ported to this little beast.

Hey, this is open source and open hardware territory: let’s contribute!

Sadly, setting up the non-cloud-based environment on my Windows 7 64-bit laptop was not that easy and it took me about an entire Saturday morning: what follows is a summary of my experience as a step-by-step guide.

Step 1: get the base software

Start by downloading the following pieces of software and install/uncompress them all, except Zadig (it doesn’t need installation) and the Spark Core Serial Driver (you can’t install it until we reach step 4). Use an installation folder of your choice, but avoid anything like Program Files (x86): those brackets and spaces will drive you crazy:

  • GCC ARM ver. 4.8
  • Make 3.81
  • DFU-util ver. 0.7 binary
  • GIT ver. 1.8.5
  • Atlassian SourceTree (Optional)
  • Eclipse CPP Kepler x86_64 (I’m an Eclipse fan)
  • Zadig ver. 2.0.1
  • Spark Core Serial Driver

Again, it’s much better to install this software into a path that doesn’t contain special characters or spaces: my advice is to use something like C:\Spark.

Step 2: set environment variables

To ease development it’s best if some binaries, like make, gcc and dfu-util, end up in the system path. My user PATH variable looks like the following (adjust it to your own installation paths):

C:\Spark\GNU Tools ARM Embedded\bin;C:\Spark\GnuWin32\bin;C:\Spark\DFU Util\win32-mingw32;C:\Spark\Git\bin
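
To verify everything is reachable, open a new command prompt and ask each tool for its version (the exact executable names assume the standard distributions listed in step 1):

arm-none-eabi-gcc --version
make --version
dfu-util -V
git --version

Each command should print a version banner; if Windows complains the command is not recognized, double check the corresponding PATH entry.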

Step 3: build the firmware

In order to build your application for the Spark Core you need to rebuild the entire firmware, but don’t be scared by that: you are already doing it when you use the cloud-based IDE, even if you are not really aware of it.

To rebuild the firmware you need to get its sources which, I know it might sound weird, are openly and freely available on GitHub: what you need to do is use the Git command line tools (or a GUI wrapper like SourceTree) to clone three repositories:

  • core-firmware
  • core-common-lib
  • core-communication-lib

At this point all you need to do is build your own firmware by issuing the make command from within the core-firmware/build folder, so get on the command line and do it!
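
From the command prompt the whole sequence looks roughly like this, assuming you keep everything under C:\Spark and that the three repositories are still available under the spark organization on GitHub:

cd C:\Spark
git clone https://github.com/spark/core-firmware.git
git clone https://github.com/spark/core-common-lib.git
git clone https://github.com/spark/core-communication-lib.git
cd core-firmware\build
make

If the build succeeds you will find core-firmware.bin inside the core-firmware\build folder: that is the file we are going to flash in step 5.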

Step 4: install the drivers

The next step was tricky for me to get right, but only because I did things in the wrong order, which means this guide should make it a piece of cake for you.

First let’s install the serial driver to allow debugging of our code. Plug your Spark Core into a USB port and hold the MODE button (the one on the analog pins side) for about 3 seconds, until the LED starts blinking blue: congrats, you just put your device into Listening Mode.

At this stage you should hear the usual Windows sound telling you a new device has been plugged in: after a few seconds the classic notification window will tell you a driver is being searched for and, after a while, you’ll get the “No driver could be found” warning. It’s time for the Spark Core Serial Driver we downloaded earlier to enter the scene: click on the warning window and point the Windows driver installation procedure to it!

You should be able to find the Spark Core among the computer serial ports with a nice label saying Spark Core with Wifi and Arduino Compatibility (mine is recognized as COM12, but yours might be different).

This is the driver you are going to use to access the device’s serial port over USB, but we also need the USB connection for another important task: flashing the firmware.

What you need to do is switch the device into DFU Mode: while connected to USB, press and hold the MODE button, then after one second tap the RESET button and release it while keeping the MODE button pressed. After three seconds or so the RGB LED should start blinking yellow rapidly, which means the device entered DFU Mode, and another Windows sound is played: congrats again!

Get to the command line again and this time execute:

dfu-util -l

You should get something like the following as output, which means your Spark Core is almost ready for flashing:

Copyright 2005-2008 Weston Schmidt, Harald Welte and OpenMoko Inc.
Copyright 2010-2012 Tormod Volden and Stefan Schmidt
This program is Free Software and has ABSOLUTELY NO WARRANTY
Please report bugs to dfu-util@lists.gnumonks.org

Found DFU: [1d50:607f] devnum=0, cfg=1, intf=0, alt=0, name="UNDEFINED"
Found DFU: [1d50:607f] devnum=0, cfg=1, intf=0, alt=1, name="UNDEFINED"

It’s time to run the last tool in our toolchain: Zadig. If you still remember where you downloaded it, double click on it and install the WinUSB driver for the CORE DFU device to allow the flashing procedure to work as expected.

Step 5: flash it!

Well, this is the easiest step: all you have to do is execute the following on the command line:

dfu-util.exe -d 1d50:607f -a 0 -s 0x08005000:leave -D core-firmware\build\core-firmware.bin

Just be sure you are actually pointing at existing files 😉

Step 6: make it sweeter!

Now that you are able to flash the Spark Core without the Cloud IDE, I believe you’ll want something better than Windows Notepad to edit and build your projects: my choice is Eclipse, so let’s make the build process a little more visual.

For each of the three cloned projects, from within Eclipse CPP:

  1. Import > Existing Code as Makefile Project
  2. Select GNU Autotools as Toolchain

For the core-firmware project only:

  1. From the project contextual menu Make Target > Build…
    1. Add target “all”
    2. Add target “clean”
  2. From the project contextual menu Properties > C/C++ Build
    1. Within the Builder Settings tab set the Build-Directory to ${workspace_loc:/core-firmware}/build
    2. Within the Environment subsection add a PATH variable pointing at your system environment path
  3. Project > Build All should make the same binary as from command line
  4. Select Run > External Tools > External Tools Configurations… and create a new configuration that executes the dfu-util command, so you can upload your firmware with a mouse click (see the example after this list).
  5. If, like me, you are able to build everything but you get validation errors within Eclipse, add the GNU ARM Toolchain includes to the core-firmware project: under Project > Properties > C/C++ General > Paths and Symbols > Includes > GNU C++, add the arm-none-eabi\include subfolder of the GNU ARM Toolchain installation.
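
For reference, the external tool configuration mentioned in step 4 could look roughly like this; the paths assume the C:\Spark layout used throughout this guide, so adjust them to your own setup:

Location:          C:\Spark\DFU Util\win32-mingw32\dfu-util.exe
Working Directory: ${workspace_loc:/core-firmware}
Arguments:         -d 1d50:607f -a 0 -s 0x08005000:leave -D build\core-firmware.bin

Remember the device must be in DFU Mode (blinking yellow) when you launch it.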

And that’s all folks!