The SparkCore: makers meet the cloud!

OK, I admit it: I’ve recently been caught up in the maker fever.

Now, I’ve got two Spark Cores (those little devices are amazing!) that I intend to use for a project of mine, but the nRF24L01 library I need has not yet been ported to this little beast.

Hey, this is the open source and open hardware ground: let’s contribute!

Sadly, setting up the non-cloud-based environment on my Windows 7 64-bit laptop was not that easy and took me an entire Saturday morning: what follows is a summary of my experience as a step-by-step guide.

Step 1: get the base software

Start by downloading the following pieces of software and install/uncompress them all, except Zadig (it doesn’t need installing) and the Spark Core Serial Driver (you can’t install it until we reach step 4), into an installation folder of your choice. Avoid anything like Program Files (x86): those brackets and spaces will drive you crazy:

  • GCC ARM ver. 4.8
  • Make 3.81
  • DFU-util ver. 0.7 binary
  • GIT ver. 1.8.5
  • Atlassian SourceTree (Optional)
  • Eclipse CPP Kepler x86_64 (I’m an Eclipse fan)
  • Zadig ver. 2.0.1
  • Spark Core Serial Driver

Please note it would be much better to install these tools into a path that doesn’t contain special characters and/or spaces: my advice is to use something like C:\Spark.

Step 2: set environment variables

To ease development it’s best if some binaries end up in the system path: make, gcc and dfu-util. My user PATH variable looks like the following (adjust it to your own installation paths):

C:\Spark\GNU Tools ARM Embedded\bin;C:\Spark\GnuWin32\bin;C:\Spark\DFU Util\win32-mingw32;C:\Spark\Git\bin
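A quick way to check the PATH took effect is to query each tool’s version from a fresh command prompt (the exact version numbers will of course depend on what you downloaded):

```shell
:: Open a NEW command prompt so the updated PATH is picked up,
:: then verify each tool resolves from anywhere:
arm-none-eabi-gcc --version
make --version
dfu-util -V
git --version
```

If any of these commands is not recognized, go back and double-check the corresponding entry in your PATH variable.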

Step 3: build the firmware

In order to build your application for the Spark Core you need to rebuild the entire firmware, but don’t be scared by that: you are already doing it whenever you use the cloud-based IDE, even if you are not really aware of it.

To rebuild the firmware you need to get its sources which, I know it might sound weird, are openly and freely available on GitHub: what you need to do is use the Git command line tools (or a GUI wrapper like SourceTree) to clone three repositories.

At this point all you need to do is build your own firmware by issuing the make command from within the core-firmware/build folder, so get on the command line and do it!
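As a sketch, the whole step boils down to a handful of commands. core-firmware is the repository this guide refers to throughout; core-common-lib and core-communication-lib are, to the best of my recollection, the two companion repositories it expects to find alongside it, so double-check the names on the Spark GitHub account before cloning:

```shell
:: Clone the three repositories side by side: core-firmware expects
:: its siblings in adjacent folders.
git clone https://github.com/spark/core-firmware.git
git clone https://github.com/spark/core-common-lib.git
git clone https://github.com/spark/core-communication-lib.git

:: Build from within the build folder of core-firmware:
cd core-firmware\build
make
```

If everything goes well you’ll find a core-firmware.bin file in the build folder, ready for flashing later on.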

Step 4: install the drivers

The next step was tricky for me to get right, but only because I did things in the wrong order, which means this guide should make it a piece of cake for you.

First, let’s install the serial driver to allow debugging of our code. Plug your Spark Core into a USB port and hold the MODE button (the one on the analog pins side) for about 3 seconds, until the LED starts blinking blue: congrats, you just put your device into Listening Mode.

At this stage you should hear the usual Windows sound telling you a new device has been plugged in: in a few seconds the classic notification window will tell you a driver is being searched for and, after a while, you’ll see the “No driver could be found” warning. It’s time for the Spark Core Serial Driver we downloaded earlier to enter the scene: click on the warning window and point the Windows driver installation procedure at it to get it working!

You should now find the Spark Core among the computer’s serial ports with a nice label saying Spark Core with Wifi and Arduino Compatibility (mine is recognized as COM12, but yours might differ).

This is the driver you’ll use to access the device’s serial port over USB, but we also need the USB line for another important task: flashing the firmware.

What you need to do is switch the device into DFU Mode: while connected to USB, press and hold the MODE button; after one second tap the RESET button and release it while keeping the MODE button pressed. After three seconds or so the RGB LED should start blinking yellow rapidly, which means it has entered DFU Mode, and another Windows sound is played: congrats again!

Get to the command line again and this time execute:

dfu-util -l

You should get something like the following as output, which means your Spark Core is almost ready for flashing:

Copyright 2005-2008 Weston Schmidt, Harald Welte and OpenMoko Inc.
Copyright 2010-2012 Tormod Volden and Stefan Schmidt
This program is Free Software and has ABSOLUTELY NO WARRANTY
Please report bugs to

Found DFU: [1d50:607f] devnum=0, cfg=1, intf=0, alt=0, name="UNDEFINED"
Found DFU: [1d50:607f] devnum=0, cfg=1, intf=0, alt=1, name="UNDEFINED"

It’s time to run the last tool in our toolchain: Zadig. If you still remember where you downloaded it, take the trouble to double-click on it and install the WinUSB driver for the CORE DFU device to allow the flashing procedure to work as expected.

Step 5: flash it!

Well, this is the easiest step, all you have to do is to execute the following on the command line:

dfu-util.exe -d 1d50:607f -a 0 -s 0x08005000:leave -D core-firmware\build\core-firmware.bin

Just be sure you are actually pointing at existing files 😉
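For example, before flashing you can quickly confirm the binary actually came out of the build (this assumes you are in the folder containing the core-firmware clone):

```shell
:: Confirm the binary produced by make exists...
dir core-firmware\build\core-firmware.bin

:: ...and only then flash it over DFU:
dfu-util.exe -d 1d50:607f -a 0 -s 0x08005000:leave -D core-firmware\build\core-firmware.bin
```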

Step 6: make it sweeter!

Now that you can flash the Spark Core without the Cloud IDE, I believe you’ll want something better than Windows Notepad to edit and build your projects: my choice is Eclipse, so let’s make the build process a little more visual.

For each of the three cloned projects, from within Eclipse CPP

  1. Import > Existing Code as Makefile Project
  2. Select GNU Autotools as Toolchain

For the core-firmware project only:

  1. From the project contextual menu Make Target > Build…
    1. Add target “all”
    2. Add target “clean”
  2. From the project contextual menu Properties > C/C++ Build
    1. Within the Builder Settings tab set the Build-Directory to ${workspace_loc:/core-firmware}/build
    2. Within the Environment subsection add a PATH variable pointing at your system environment path
  3. Project > Build All should make the same binary as from command line
  4. Select Run > External Tools > External Tools Configurations… and create a new configuration to execute the dfu-util command to upload your firmware with a mouse click!
  5. If, like me, you can build everything but you get validation errors within Eclipse, add the GNU ARM Toolchain includes to the core-firmware project by adding the arm-none-eabi\include subfolder of the GNU ARM Toolchain under Project > Properties > C/C++ General > Paths and Symbols > Includes > GNU C++

And that’s all folks!


Smarter Eclipse quality friendly config

If you haven’t realized I have some sort of addiction to software quality, then this must be the first time you read my blog: it doesn’t matter, because you are reading it now!

Here is another of my famous (!!!) tips for a better Eclipse IDE configuration, and this time I’m trying to help all those software developers out there who, willingly or not, smash their faces against tests, be they unit or automated ones. If you don’t write such tests, don’t despair: this tip might still help you!

With the introduction of static imports, more and more libraries have converted or extended their APIs with convenient static methods, with testing libraries being one of the most populated categories.

If you use JUnit, Mockito and/or any other library using static methods for building complex object structures you will certainly know the code you write should NOT look like the following:

  public void someTestMethod() {
    Mockito.when(mockedObject.someMethod(Matchers.any(String.class))).thenReturn(new Object());
    // some code here
    Assert.assertEquals(expectedValue, mockedObject.someMethod());
  }

Good modern code should look like the following instead:

  import static org.junit.Assert.assertEquals;
  import static org.mockito.Matchers.any;
  import static org.mockito.Mockito.when;

  public void someTestMethod() {
    when(mockedObject.someMethod(any(String.class))).thenReturn(new Object());
    // some code here
    assertEquals(expectedValue, mockedObject.someMethod());
  }

Our favourite Eclipse IDE can be instructed to help you write those neat lines of code, recognizing that you are using static methods and automatically adding the corresponding static imports.

This not widely known preference is available in the form of a configurable list of packages and types candidate for static methods and attributes scanning, reachable at Window > Preferences > Java > Editor > Content Assist > Favorites (yes, not the easiest to find, I agree).
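As an example, entries in that list take the form of fully qualified names with a wildcard; the following are the ones I’d add for a typical JUnit/Mockito setup (the exact classes obviously depend on the libraries you use):

```
org.junit.Assert.*
org.mockito.Mockito.*
org.mockito.Matchers.*
org.hamcrest.Matchers.*
```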

After configuring your favourite libraries in that list you can forget about the Assert, Mockito and Matchers classes: I just start typing the method name and the IDE does all the boring stuff for me!

This is what I’ve configured at the moment in my STS


Eclipse: annoying JSP errors

A few weeks ago I posted something about annoying Eclipse validation errors regarding minified JavaScript files. Today I’m here to solve the same issue with regard to JSP validation, specifically when those errors are due to files not related to the project.

I’ve just set up the Maven Cargo Plugin for my project and suddenly got tons of errors due to JSP files within the Tomcat 7 distribution not satisfying JSP validation rules: I hate those distracting errors/warnings, so I decided to apply the same solution I used for JavaScript files.


It’s quite easy, but it has to be done on a per-project basis: right click on your project, select Properties, and in the pop-up window select Validation, then click on the ellipsis (…) button for the JSP Content Validator (you’ll have to repeat the same for the JSP Syntax Validator).

In the new pop-up window you will have to add an Exclude Group and then add a rule for the target folder.


Run a clean build to free your project from unnecessary validation errors!

Annoying JavaScript Eclipse Errors

I hated the way Eclipse handled source code validation, especially for minified JavaScript files (read: jQuery): you always end up disabling a specific validation entirely to remove that red cross that makes your project look weird.

Now, after a little googling, I’ve found THE solution!

I will not annoy you with the details of how it works: just open your favourite Eclipse IDE, right click on the project(s) containing the offending files, select Properties > JavaScript > Include Path > Source and add one or more exclusions.



If you, like me, have struggled with this for a long time, feel free to share this post.