Channel: Evothings | RSS Feed

How to control the MediaTek Linkit Connect 7681 using a DIY mobile application


During the last year, several chip vendors have released low-cost WiFi boards to power the Internet of Things revolution. MediaTek's contribution is the newly released LinkIt Connect 7681. In this tutorial we will show how you can control a GPIO on the board using a mobile application running on your smartphone, which you can customise and adapt to suit your own IoT innovations and projects.

[Image: The MediaTek LinkIt Connect 7681 development board]

The development board is based on the MediaTek MT7681 chipset, which has been designed to help makers, hackers and constructors connect anything to the internet. The board can either connect to an access point or act as one. It also has five general-purpose input/output pins (GPIOs, numbered 0-4) and one universal asynchronous receiver/transmitter (UART) port. Compared to the MediaTek LinkIt ONE, which we wrote about earlier, the board has a much simpler hardware architecture. The intended use is to either build really simple connected projects or to extend an existing project with connectivity.

If you are curious to learn more about the LinkIt Connect 7681, there is a great developers guide from MediaTek that covers what you need to know in order to start working with the hardware.

Source Code

You can browse the source code for this tutorial in our designated Evothings GitHub repository.

What you need

To work through this tutorial, you will need the following:

  • The MediaTek LinkIt Connect 7681.
  • An iOS or Android smartphone.
  • A computer running Microsoft Windows or Ubuntu Linux.

Step 1 – Hardware requirements

This tutorial requires no external hardware, since each GPIO has an LED connected to it.

Step 2 – Embedded software

Preparation

If you already have the MediaTek LinkIt Connect 7681 SDK installed on your computer, you can skip this step.

Before you can start compiling and downloading any software to your board, you need to set up the development environment on your computer. As of today there is support for Windows and Ubuntu (Linux). MediaTek provides instructions on how to configure the development environment in their developers guide, which you can find on their website. From this point on, we assume that you have already installed the development environment and that you know how to open a serial connection to the board.

Got a Mac? This tutorial was written running Ubuntu 14.04 LTS within VirtualBox on a Mac OS X computer, so there's always a way.

Application Design

Enough with preparations, let's get down to business. We are now going to configure the development board to connect to an existing WiFi network and start listening for TCP connections on port 4538. When a client makes a connection, the software sends the current status of the five GPIOs and then idles, awaiting new commands from the connected client.

In order to accomplish this we need a simple protocol to control the LEDs. In this tutorial we'll define a protocol of the form $XY#, where X (0-4) is the GPIO we want to change and Y (1/0) is the binary state we want to change it to, i.e. a 1 will turn the LED on and a 0 will turn it off. The development board then acknowledges that the change has been made by sending a response identical to the request.

If, for example, you would like to turn the first LED on, the request would be $01#, since the first LED is connected to GPIO 0. We would then expect the board to answer with $01#, thus verifying that the LED has been turned on.
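Since the mobile app will speak this protocol in JavaScript, it can be useful to see the framing in code. The helper names below are hypothetical, purely illustrative, and not part of the example source:

```javascript
// Hypothetical helpers illustrating the $XY# protocol.

// Build a request string for setting GPIO `pin` (0-4) on or off.
function buildRequest(pin, on) {
    if (pin < 0 || pin > 4) throw new Error('Invalid GPIO: ' + pin)
    return '$' + pin + (on ? '1' : '0') + '#'
}

// Decode a board response; returns { pin, value } or null if invalid.
function parseResponse(message) {
    var match = /^\$([0-4])([01])#$/.exec(message)
    if (!match) return null
    return { pin: Number(match[1]), value: Number(match[2]) }
}
```

For example, buildRequest(0, true) yields the request '$01#' from the example above, and parseResponse() rejects anything that does not match the frame format.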

Create a project

The first step in this part of the tutorial is to create an empty project.

You do this by copying the src/ directory from the SDK to a path where you would like to store your project. For legal reasons, we are unable to provide the entire source code for this example. The source code is instead provided as a patch file, created simply by comparing our project directory with the empty project in the src/ directory and saving the difference between the directories to a patch file.

In order to apply the patch to your new project you need the patch utility, which is usually provided with Linux distributions. If you are using Windows, you'll need to install patch separately. For example, during the installation of the MediaTek LinkIt Connect 7681 SDK you're asked to install Cygwin, which bundles a patch application that we recommend you install.

The patch itself can be downloaded from our GitHub repository.

Start Cygwin or open a terminal and browse to the directory containing your project. Also ensure that you know the path to the Evothings-MediaTek-Connect-7681-TCP-Led.patch provided by Evothings. Execute the following command:

patch -p1 < path/to/Evothings-MediaTek-Connect-7681-TCP-Led.patch

If the command executes without any issues you should see the following printout:

patching file cust/iot_custom.c
patching file cust/tcpip/iot_tcp_app.c
patching file mak/MT7681/flags_sta.mk

Before you try to build and upload the firmware you’ll need to add your WiFi credentials in the source code. The following section contains instructions on how to do that.

Configure WiFi

The first step is to add the credentials of your WiFi to the source code.

Open the file cust/iot_custom.c and find the function IoT_Cust_SM_Smnt(). Change the Ssid variable to the name of your network and Passphrase to the password of that network. When we developed this tutorial we faced an issue that prevented us from changing which WiFi network the board would connect to. We found that sending the command “AT#Default” (without quotes) to the board over the serial connection solved the issue, so keep that in mind if you have any problems with the WiFi connection. We sent that command each time we uploaded new firmware that changed either the Ssid or Passphrase variable, and faced no further issues.

In the same file we have to ensure that the board listens for connections on port 4538. This is simply done by adding the function call uip_listen(HTONS(4538)) to the function iot_cust_init(), which is called once after the module initializations have finished.

Open the file cust/tcpip/iot_tcp_app.c in a text editor and locate the function iot_tcp_appcall(). This function is executed each time a TCP event occurs; in this tutorial we detect when a connection is made on port 4538 and execute the function app_handle_request(), which is the base of our implementation. When a new client connects, the following code is executed.

    if(uip_connected()) {
        /* A new client has connected; send the current status of all pins. */
        u8_t buffer[sizeof(valid_pins) * RESPONSE_LENGTH];
        u8_t buffer_length;
        app_create_response(buffer, &buffer_length);
        uip_send(buffer, buffer_length);
    }

This code verifies the connection and responds to the connection by sending the current status of the LEDs on the board. When connecting to a board that currently has LED 1 and 3 turned on the data sent will be the following:

$00#
$11#
$20#
$31#
$40#

When a connected client sends new data, the data is received and parsed using the app_is_request_valid() function. If the message received is valid, it is acted upon; in this case that means an LED is either turned on or off, after which an acknowledgment is sent back to the client. Invalid messages are simply ignored. The code that accomplishes this can be seen below.

    else if(uip_newdata()) {
        u16_t index = 0;

        /* Scan the received data for valid protocol requests. */
        while(index < uip_datalen()) {

            u8_t *request = (u8_t *)uip_appdata + index;

            if(app_is_request_valid(request)) {

                /* Decode the pin number and requested state from ASCII. */
                u8_t pin = request[REQUEST_PIN_OFFSET] - '0';
                u8_t pin_value = request[REQUEST_PIN_VALUE_OFFSET] - '0';
                /* Note the inverted value passed to the GPIO output. */
                iot_gpio_output(pin, !pin_value);
                u8_t polarity;
                u8_t response[RESPONSE_LENGTH];
                /* Read the pin back and acknowledge with its actual state. */
                iot_gpio_read((int32)pin, &pin_value, &polarity);
                write_response_to_buffer(response, (u8_t)pin, pin_value);
                uip_send(response, sizeof(response));
                index += REQUEST_LENGTH;
            }
            else {
                index++;
            }
        }
    }

Now we are ready to try to build and upload the application. MediaTek provides instructions on how to do that in their Developer’s guide (see links above).

Step 3 – Mobile application

Preparation

If you have already installed a working copy of Evothings Studio, you can skip this step.

First you have to install Evothings Studio on your computer. Evothings Studio consists of two applications that interact with each other. You install Evothings Workbench on your computer; it provides the interface in which you will perform your development. You'll also need to install the Evothings Client (iOS, Android) on your smartphone. When started, the client can scan for a Workbench environment and open a connection dialog. Once connected, simply pressing RUN for any project on the PC will push the corresponding code to the mobile application, just like that.

Source code

The application source consists of three files: an index.html file which contains the user interface, an app.css file which contains the style sheet information, and finally app.js which contains the application logic. In this tutorial we'll also make use of chrome.sockets.tcp in order to open a socket to our MediaTek LinkIt Connect 7681 board. The socket is then used to send requests to and receive responses from the board according to the protocol we defined earlier. And yes, the Chromium socket low-level code is already compiled into the Evothings Client beforehand, ready to run straight from JavaScript.

[Image: The controlView of the mobile application]

The user interface is built on three views using the jQuery Mobile framework: a startView, a connectingView and a controlView, which can be seen above. The startView is the first view the user faces when running the application. It consists of a text input field, where the user enters the IP address of the development board, and a connect button. The connectingView is displayed while the application tries to open a TCP socket. Finally, the controlView contains five circles, each representing an LED on the board, and a disconnect button. Easy as pie, yet a worthy starting point that hopefully challenges you to make great apps.

The following code is executed once the application is fully loaded and the code defines event handlers for the connectButton and disconnectButton.

$(document).ready(function() {

    $('#connectButton').click(function() {
        app.connect()
    })

    $('#disconnectButton').click(function() {
        app.disconnect()
    })
})

As you can see in the code above, the app.connect() method is executed when the connectButton is pressed. The application first hides the startView and displays the connectingView. Then it tries to open a socket, using the chrome.sockets.tcp API, to the IP address the user entered in the IPAddress input field. If it succeeds in opening a socket, it hides the connectingView, displays the controlView and registers a callback to execute when data is received. If it fails to open a socket, an error message is displayed to the user, the connectingView is hidden and the startView is displayed again. In the code below you can also see the callback that handles incoming data; it basically converts the data to a string and then executes the app.parseReceivedData() method.

app.connect = function() {

  var IPAddress = $('#IPAddress').val()

  console.log('Trying to connect to ' + IPAddress)

  $('#startView').hide()
  $('#connectingStatus').text('Connecting to ' + IPAddress)
  $('#connectingView').show()

  chrome.sockets.tcp.create(function(createInfo) {

    app.socketId = createInfo.socketId

    chrome.sockets.tcp.connect(
      createInfo.socketId,
      IPAddress,
      app.PORT,
      connectedCallback)
  })

  function connectedCallback(result) {

    if (result === 0) {

       console.log('Connected to ' + IPAddress)

       $('#connectingView').hide()
       $('#controlView').show()

       chrome.sockets.tcp.onReceive.addListener(function(info) {
         var data = app.bufferToString(info.data)
         console.log('Received: ' + data)
         app.parseReceivedData(data)
       })
    }
    else {

      var errorMessage = 'Failed to connect to ' + IPAddress
      console.log(errorMessage)
      navigator.notification.alert(errorMessage, function() {})

       $('#connectingView').hide()
       $('#startView').show()
    }
  }
}

The method app.parseReceivedData() handles the incoming data. It searches the string for parts matching the pattern we defined in our protocol, and each match is then passed to the method app.handleVerifiedResponse(), as you can see in the code snippet below.

app.parseReceivedData = function(data) {

  var regExpPattern = /\$[0-4][0-1]#/gm

  // match() returns null when no valid response is found in the data.
  var matches = data.match(regExpPattern) || []

  matches.forEach(function(element) {
    app.handleVerifiedResponse(element)
  })
}

The method app.handleVerifiedResponse() contains a bit of jQuery magic. The first thing it does is extract the GPIO pin and the value of that pin from the response. Since the response has already been verified by the app.parseReceivedData() method, no further checking is done at this point.

Each LED is represented by a <div> element belonging to the class circleBase, which ensures that the <div> is drawn as a circle. Each <div> also has a unique id of the form ledX, where X is a LED number from 0 to 4. This id is used to find the <div> representing each LED. Once the <div> is identified, its color is set to a color corresponding to the physical LED if the response states that the LED has been lit; otherwise the color is set to grey. This is done by adding and removing the classes ledOn and ledOff, whose colors are defined in app.css. The code also ensures that the click event is bound to either app.ledOn() or app.ledOff(), depending on whether the LED was turned on or off. The code that implements this functionality can be seen below.

app.handleVerifiedResponse = function(response) {

  var pin = response.charAt(1)
  var pin_value = parseInt(response.charAt(2))

  var domId = '#led' + pin

  if($(domId).length == 0) {

    var htmlString = '<div style="display:inline-block">' +
      '<div id="led' + pin +'" class="circleBase ledOff"></div>' +
      '<p class="center">' + pin + '</p></div>'

    $('#ledView').append($(htmlString))
  }

  if(pin_value == 1) {

    $(domId).removeClass('ledOff').addClass('ledOn')
    $(domId).unbind('click').click(function(){
      app.ledOff(domId)
     })

  }
  else {

    $(domId).removeClass('ledOn').addClass('ledOff')
    $(domId).unbind('click').click(function(){
      app.ledOn(domId)
    })
  }
}

The methods app.ledOn() and app.ledOff() are both basically wrappers that embed a request into a string following the defined protocol. Both methods then call app.sendString(), which in turn acts as a wrapper for the chrome.sockets.tcp.send() method. The implementation of app.sendString() can be seen below.

app.sendString = function(sendString) {

  console.log('Trying to send: ' + sendString)

  chrome.sockets.tcp.send(
    app.socketId,
    app.stringToBuffer(sendString),
    function(sendInfo) {

      if(sendInfo.resultCode < 0) {

        var errorMessage = 'Failed to send data'

        console.log(errorMessage)
        navigator.notification.alert(errorMessage, function() {})
      }
    })
}
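The app.ledOn() and app.ledOff() wrappers themselves are not shown here; given the protocol defined earlier, minimal versions could look like the sketch below (the exact implementation in the repository may differ):

```javascript
var app = app || {}  // the app namespace used throughout the example

// Turn an LED on; domId is of the form '#ledX' where X is the GPIO number.
app.ledOn = function(domId) {
    var pin = domId.charAt(domId.length - 1)
    app.sendString('$' + pin + '1#')
}

// Turn an LED off.
app.ledOff = function(domId) {
    var pin = domId.charAt(domId.length - 1)
    app.sendString('$' + pin + '0#')
}
```

So clicking the circle for LED 3 while it is off would end up sending the string '$31#' through app.sendString().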

In the code, there are two helper functions, app.stringToBuffer() and app.bufferToString() which encodes the string to be sent and decodes the received data.
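As a rough sketch of what such helpers typically look like (the actual implementations in app.js may differ; this version assumes single-byte ASCII characters, which holds for our protocol):

```javascript
var app = app || {}  // the app namespace used throughout the example

// Encode an ASCII string into an ArrayBuffer, one byte per character.
app.stringToBuffer = function(string) {
    var buffer = new ArrayBuffer(string.length)
    var view = new Uint8Array(buffer)
    for (var i = 0; i < string.length; i++) {
        view[i] = string.charCodeAt(i)
    }
    return buffer
}

// Decode an ArrayBuffer of ASCII bytes back into a string.
app.bufferToString = function(buffer) {
    return String.fromCharCode.apply(null, new Uint8Array(buffer))
}
```

The round trip is lossless for our protocol strings, since they only contain ASCII characters.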

Finally, there is an app.disconnect() method that executes chrome.sockets.tcp.disconnect(), hides the controlView and displays the startView. The method is called when the disconnectButton is pressed.

Summary

In this tutorial, we have demonstrated how easily you can write a cross-platform mobile application that communicates with your MediaTek LinkIt Connect 7681 using TCP sockets. The mobile application was developed using the Evothings Studio. The application should provide a great starting point for any connected project related to the MediaTek LinkIt Connect 7681.

We would love to see what you build using these products, please share your projects with @evothings.

Happy tinkering!


Using Google Charts in your app to visualise data from Parse IoT cloud


In this tutorial you'll learn to retrieve stored TI SensorTag data and visualize it using a Google Charts widget inside the web container of an Evothings app. You'll be using the Parse IoT cloud service and Evothings Studio. The live data source is a Texas Instruments SensorTag, tethered to a modern smartphone via Bluetooth Smart.

Facebook's Parse cloud is a Mobile-Backend-as-a-Service: simply put, it's designed for mobile app developers who want app persistence and mobile backend data storage in the form of a cloud service. Previously, we've used the Parse cloud to store temperature data generated by a Texas Instruments SensorTag. For IoT apps, chances are that we also need to retrieve the stored data and show it to the user.

Important: In order to follow this tutorial, you should have completed all the steps listed in How to make a mobile IoT app for the Facebook Parse data cloud, so that you have a Parse account and have already seen basic data coming in from your sensor device or mock-up pseudo device.

Introducing Evothings Studio and the Google Charts Library

Evothings Studio is a development environment for creating mobile apps for IoT. There is a Workbench application for your computer, and clients available on the public app stores. The Evothings Client, which runs on your phone or tablet, is a hybrid application available in both iOS and Android flavors. You program it using HTML5/JavaScript, and since it uses web technologies you can use the common modern client-side JavaScript libraries. This provides you, as a developer, with an easy way to rapidly develop and prototype your mobile IoT app for both Android and iOS in one go.

Google Charts is one of the most widely used real-time data visualization libraries. It's fairly easy to use and has a wide variety of ready-to-use charts and widgets for you to embed in your HTML code. You can also customize the chosen charts using a range of parameters provided by the library, to achieve the look and feel you want.

For the purpose of this tutorial, we are going to use an analog-style “Gauge Chart”, as it has an old-school touch to it and is well suited to temperature data visualization.

Download the example app code

To dive in quickly, get your copy of the example app code, with all the project files. The downloaded app code can be stored anywhere you like, e.g. in your favourite workspace folder.

Update the app code with your Parse app credentials

Open up the index.html file in your favorite text editor. In this file you'll find an initializeParse() function; update the “Parse Application ID” and “JavaScript Key” with the values you retrieved while creating your Parse cloud app in the first part of this tutorial.
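The function body looks roughly like the sketch below (the two key strings are placeholders for your own credentials, and the exact code in the downloaded index.html may differ):

```javascript
// Hook the app up to your Parse cloud app. The two key strings are
// placeholders; paste in the values from your own Parse app's settings.
function initializeParse() {
    Parse.initialize(
        'YOUR_PARSE_APPLICATION_ID',
        'YOUR_JAVASCRIPT_KEY'
    )
}
```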

Running the example

  1. Launch Evothings Workbench
  2. Drag-drop the “index.html” file into the Workbench, which will create a new project entry on top of the project list
  3. Connect to the Workbench from inside the “Evothings Client” on your phone using your WiFi IP address or Workbench discovery tool
  4. Click the “Run” button for the Parse IoT Demo with Google Charts
  5. Activate the TI SensorTag and press “Start reading SensorTag” button just as we did before (if you don’t have the sensor hardware, just use the demo method given in previous tutorial to see the output)

Now the app will start notifying you that it's storing data upstream to your Parse cloud app, and at the same time it will start downloading the latest uploaded temperature record from the Parse cloud, which in turn updates our Gauge chart in real time!

Code explanation

Find the function drawChart() in index.html. This is the main function, which manages three tasks: drawing the chart, downloading the last temperature record from the Parse cloud, and updating the chart with the record retrieved.

The following code block sets the chart customization options, selects the target HTML element with the ID chart_div and draws the Gauge chart in that element:

// Initial value for the chart
   var data = google.visualization.arrayToDataTable([
      ['Label', 'Value'],['Temp', 20]
   ]);

// config options for the chart
   var options = {
      width: 400, height: 120,
      redFrom: 25, redTo: 30, yellowFrom:22, yellowTo: 25,
      minorTicks: 1, max: 30
   };
   var chart = new google.visualization.Gauge(document.getElementById('chart_div'));
   chart.draw(data, options);

Once we have the chart in place, we query the Parse cloud every 2500 ms in order to download the latest record and then redraw the chart:

setInterval(function() {
// Retrieve the last temperature entry from the Parse cloud
   var ReadingObj = Parse.Object.extend("SensorTagReading")
   var query = new Parse.Query(ReadingObj)
// Retrieve the most recent object
   query.descending("createdAt")
   query.find({
      success: function(readingObj) {
         var temp = readingObj[0].get("Temperature")
         // Pass the temperature value to the chart and redraw it.
         // This happens inside the callback, since find() is asynchronous.
         data.setValue(0, 1, temp)
         chart.draw(data, options)
      }
   });
}, 2500);

You have now come full circle, both storing data in and retrieving data from the Parse cloud, as well as using the Google Charts library to create engaging user interfaces for your IoT app. As a next step, explore other examples on our website, or head over to the Evothings Forum to engage with the Evothings IoT community.

Motion Sensor SMS Alert App for nerds and bikers


A smartphone is packed with sensors, and can continue to serve well for several years after it has stopped being your personal phone. In this use case you place a second device, other than your primary smartphone, on your bike under the (locked) saddle compartment. Then, if out of bravery or foolishness (despite the on-tank airbrushed skull and crossbones) someone dares to move or even budge the bike, the phone detects the motion and instantly sends you an alert via SMS. There are tons of other use cases, but as several of us have bikes, this one could be very practical!

A while back, I wrote a general tutorial on instructables.com outlining how to use a smartphone as a motion sensor alarm, again putting your outdated or second Android phone to use, by letting a detected motion event turn a table lamp on or off. So let's take this theme a bit further and, for the use case I just mentioned, invoke an SMS to a preset number, typically your other phone. That will allow you to receive alerts even when you are not in close proximity to your belongings, or even to send them to a list of multiple receivers. Your whole gang can come and teach that motorcycle-handling menace a lesson!

Please note: Even though Evothings Studio is a multi-OS app-making system, this tutorial can only be used when developing for Android, simply because on iOS (and Windows for mobile devices) an SMS can only be invoked using the native SMS dialog, which means we can't send automated SMS alerts on those operating systems without manually pressing “OK” or “Send” in an on-screen dialog.

Still sounds interesting? Great, let’s start!

Step 1: Get Evothings Studio

We will be using Evothings Studio to rapidly develop an app in HTML5 and JavaScript. To get started (about 5 minutes):

  1. Download Evothings Workbench
  2. Download Evothings Client for Android
  3. Connect the Evothings Client app with Evothings Workbench using the app’s Workbench scanning function or by punching in your computer’s IP address.

More about Evothings Studio

  • Evothings Studio is an open source tool which lets you develop mobile apps within minutes using HTML5 and JavaScript.
  • You can start building your app right after downloading – no lengthy SDK installations.
  • It saves you from the complexity of learning native app development languages and SDKs.
  • Evothings Client is openly released under the Apache 2 licence.
  • The Evothings Client is based on the Apache Cordova app framework, meaning that functionality available for Cordova (and hence for its commercial incarnation PhoneGap) will also run under Evothings Studio and on the Evothings Client.
  • Evothings Client comes already bundled with selected Cordova plugins, allowing you to start prototyping your IoT apps straight away.
  • When you need extra Cordova plugins – like in this case where we want to send an SMS – just clone Evothings Client’s Git repository and build a custom Evothings Client that looks and behaves exactly like you want, which is covered in this recent tutorial.

Step 2: Build a custom Evothings Client with Cordova SMS Plugin

As mentioned in the previous step, the Evothings Client comes with selected plugins which you may need to develop an IoT app. However, the official Evothings Client does not come bundled with a plugin which facilitates sending an SMS directly from the app. In this case, as the Evothings Client is open source, we just need to clone a copy of its Git repository and build our own custom Evothings Client that includes the Cordova SMS Plugin.

Sounds complex? Don't worry, there is a very detailed tutorial available explaining how to add the Cordova SMS plugin (or any other plugin) to the Evothings Client. Follow the given tutorial step by step, and you will have your custom Evothings Client app with SMS sending capability in no time!

Let’s say that you’ve created your custom Evothings Client, so we can take the final step and develop our app!

Step 3: Developing Motion Sensing SMS Alert App

We will need to have two functionality blocks, as we need to:

  • Detect motion using the mobile device’s accelerometer.
  • Send an alert SMS.

For ease of understanding, download this example code from my Git repository and unzip it.
After downloading:

  1. Launch Evothings Workbench, if you haven’t already
  2. Launch your recently built customized Evothings Client on your phone.
  3. Connect Evothings Client with Evothings Workbench.
  4. Use your mouse to drag-drop the index.html from the downloaded example’s directory straight into Evothings Workbench. This creates a new top row in the Workbench with your project
  5. Click Run!

You should be able to see the example running in the Evothings Client app. Press the “Wait for movement” button; the status text will change to “Waiting for motion…”. Now move your phone slightly: the status text will change to “Motion Detected”, and after a second you will see an alert box notifying you that a text message has been sent. At the moment the message will be sent to nobody, as we have not yet filled in our phone number in the code, so please read on. You can also continue to tweak and further develop your app any way you like!

[Image: Screenshots of the motion sensing SMS alert app]

Step 4: Code Explanation

For detecting the motion, we are using Cordova’s own Accelerometer API, which is already built in. Open index.html in your favourite code editor and analyze the following code block:

// Start watching the mobile device's movement (acceleration).
	app.waitForMotion = function()
	{
		/* Start watching the device's acceleration to determine if it has
		 * moved, unless we're already watching the device's movement. */
		if (!app.watchId)
			app.watchId = navigator.accelerometer.watchAcceleration(
				app.onAccelerationReceived,
				app.onAccelerationError,
				{ frequency: app.movement_check_interval }
			)

		app.setStatusText('Waiting for motion...')
	}

First, we ask the accelerometer object to start watching for any changes in the current state. Once a change is detected, the app.onAccelerationReceived function is called. This function compares the current readings with the previous ones, and if the current and previous accelerometer readings differ along the x-axis or y-axis, the app.sendSMS function is invoked in order to send an SMS through the Cordova SMS plugin.

/* onAccelerationReceived:
	 * Called when an accelerometer value has been received from the mobile
	 * device. Determines whether the device has moved since the last check. */
	app.onAccelerationReceived = function(acceleration)
	{
		// Round the accelerometer readings to the nearest integer
		// to avoid detecting small vibrations.
		var xMotion = Math.round(acceleration.x)
		var yMotion = Math.round(acceleration.y)

		/* Compare the current readings with previous ones. If they differ 
		 * we conclude that the device has moved. If no previous readings were
		 * collected, compare with 0. */
		if (xMotion != (app.previous_xMotion || 0) ||
			yMotion != (app.previous_yMotion || 0))
		{
			app.setStatusText('Motion detected')

			// Send alert SMS.
			app.sendSMS()

			/* Stop checking for device movement to avoid sending multiple SMS.
			 * TODO: re-enable the movement checking after e.g. 1 minute. */
			app.stopWaitForMotion()

			// Store the readings for use in the comparison the next time.
			app.previous_xMotion = xMotion
			app.previous_yMotion = yMotion
		}
	}

You will need to change the PHONE_NUMBER property of the app object to your own phone number, and also change the SMS_MESSAGE property to your own custom alert text, for example “Bike is moving .. Alert! Alert! Alert!” (you can probably be more creative here :-).

// Application object.
	var app = {
		/* Phone number through which the message will be sent upon movement.
		 * Replace with your own phone number. */
		PHONE_NUMBER : '0xxxxxxxxxx',
		// Message sent to the recipient's phone upon device movement.
		SMS_MESSAGE : 'Alert! Motion has been detected',
		/* Interval at which to check for device movement, in milliseconds.
		 * Lower values means we are able to detect smaller movements, and
		 * higher values means less battery consumption.
		 */
		movement_check_interval : 3000
	}
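The app.sendSMS function itself is not shown above. A sketch of what it could look like, using the Cordova SMS plugin's sms.send() API, is given below; this is an assumption about the implementation, not the example's actual code, and the SMS API object is made injectable here so the logic can be exercised off-device:

```javascript
var app = app || {}  // the app object defined in the example
app.PHONE_NUMBER = app.PHONE_NUMBER || '0xxxxxxxxxx'
app.SMS_MESSAGE = app.SMS_MESSAGE || 'Alert! Motion has been detected'

// Send the alert SMS. `smsApi` is optional and defaults to the Cordova
// SMS plugin's global `sms` object when running on a device.
app.sendSMS = function(smsApi) {
    var api = smsApi || (typeof sms !== 'undefined' ? sms : null)
    if (!api) return false  // plugin missing (e.g. running in a browser)
    api.send(
        app.PHONE_NUMBER,
        app.SMS_MESSAGE,
        { android: { intent: '' } },  // empty intent sends without the native dialog
        function() { console.log('SMS sent') },
        function(error) { console.log('SMS failed: ' + error) }
    )
    return true
}
```

On iOS, where only the native SMS dialog is available, this function would simply bail out, which matches the Android-only note earlier in the article.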

That's it! You have successfully developed your motion sensing SMS alert app; they'll never mess with your fine wheels again without you knowing!

Pushing sensor data onto AWS from scratch in 15 minutes


In this exciting tutorial, you'll learn how to create a mobile app in JavaScript using Evothings Studio that sends data from a sensor device to the cloud and back using Amazon's AWS Lambda and DynamoDB.

By clever use of scripts and dynamic web tools, both for configuring the cloud services and for hooking up a Bluetooth-enabled sensor, live data is sent cloudside using your phone as a gateway, and back again from the cloud to show the result. You easily control every aspect of how it looks and works; all the action here is created using HTML5 and JavaScript.

So before you start your kitchen egg timer just to challenge the article's headline, you'll want to have an AWS account set up, with username and password at hand; any free or paid account will do. And this setup is just an example, even if it's a pretty cool one.

Our demo app continuously reads temperature data from a Texas Instruments SensorTag [ti.com/sensortag] Bluetooth-enabled device and passes the data to AWS using a phone as a gateway. If you don't have access to a SensorTag you can still stay with the action: you can use the demo app to read and write simulated sensor data instead, and add a physical device of your own later on. In addition, you could experiment with some other brand of BLE-enabled sensor, though I'll willingly admit that going rogue with other device models will require some additional programming; this tutorial is bespoke for the Bluetooth SimpleLink SensorTags provided by Texas Instruments, with the extremely short time limit in mind.

So here we go, game on!

Setting up AWS (estimated time for this step: 5 min)

Start by following this cloud-side wizard [amzn.to/1cVzMZN] in your computer’s web browser to set up your AWS services (i.e. IAM to create a user, Lambda to run code on incoming events and DynamoDB to store and retrieve data). Details on completing this step are found below; most fields and checkboxes are pre-configured, which basically is what the wizard does for us. It’s quite safe to invoke these entry-point services: any visible cost from the cumulative amount of data originating from the sensor tag usage in this example is minimal, typically in the order of less than $1 per day, and I myself haven’t seen any traces of it all Summer long using my free tier AWS account.

Many AWS services, like Lambda and DynamoDB are available for limited use without cost. Follow this link for the details on creating a free account: aws.amazon.com/free/

Select Template. The wizard helps you define a software stack (called an “AWS CloudFormation stack” in Amazon lingo), which is a collection of resources you’ll need. The default stack name is “IOT2015”. There is also a preconfigured AWS S3 template named “ioe2015.s3.amazonaws.com/iot.template” which automates most of the setup. On this screen you can also download the template if you want to see what it does. You don’t need to change anything here, so just press Next at the lower right to continue.

Screen Shot 2015-09-09 at 17.19.16

Specify Parameters, where the Lambda parameters are defined. (AWS Lambda is a simple yet efficient event listener and queuing mechanism which is used for gathering sensor data.) Use the default bucket name “ioe2015” and the Lambda package S3 key “iotapi.zip”. Press Next.

Screen Shot 2015-09-09 at 17.19.25

Next screen is Options. Leave the fields blank, and just press Next.

Screen Shot 2015-09-09 at 17.19.33

Review screen summarises what’s going on in the setup:

  • Creation of the Lambda function for triggering code on incoming events
  • A DynamoDB database table for storing values
  • A new user in the IAM service (Identity and Access Management), with access only to (1) and (2), and not anything else you’re running on AWS.

Check the checkbox to allow creation of the user and permissions and press Create to move on.

Now the service setup on AWS is complete (no need to create more Stacks or Services, even if the landing screen arguably tempts you to do so). Next step is to find the proper key and value names used for configuring the mobile app to connect to your newly created AWS data store.

When the CloudFormation stack creation is complete, select the created IOE2015 stack and click the Outputs tab. There you can see the configuration values that you should enter in the configuration of the Evothings application. Have these values ready to enter in the steps below.

output_tab

You’re now all done with the Amazon AWS setup – the three basic services – and good to go for the next step. Hang on, we’re going mobile!

Evothings Studio (2:30 min)

Download and install the Evothings Studio workbench software from [evothings.com/download] and the corresponding Evothings Viewer app to your iPhone or Android device, easiest found by searching for “evothings” in the public app stores. The Alpha release of Evothings Studio 2.0 will also work fine. The Evothings Client app (the Evothings Viewer app if you’re using the 2.0 Alpha) will help you preview your app in real time while developing it, without compiling each time.

Connect your phone to the workbench by generating a connect code on the computer, a short string of letters and numbers, and punching it into the Evothings Viewer to pair your phone with Evothings Studio. Then run the Hello World example from the Workbench window just for good measure.

The AWS-IoT project (2:30 min)

Download the EvoAWS example project [bit.ly/EvoAWS] and unzip it. Open the “aws-config.js” file and enter your AWS credentials for the IOT2015 user, for example:


evothings.aws.config =
{
	// generate access keys under IAM
	accessKeyId: 'ASDFASDFASDFADSF',
	secretAccessKey: 'asdASDFOIasdfopadsfpoiadfSOIadsfAPODSFi23423',

	// find FunctionName and region under Lambda
	region: 'eu-west-1',
	params: { FunctionName: 'IOE2015-IoTAPI-XXFLIHSDLFHLDF' }
}

Press Save.

Drag-n-drop the main index.html file into the Evothings Studio workbench window and press Run. The app now loads at once on your phone.

Test operations by pressing the top button READ/WRITE DEMO in the app. Write some random data by pressing WRITE TEMPERATURE VALUE. Now read the data just written by pressing READ TEMPERATURE VALUE. This shows that your AWS Lambda service is working correctly.
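Behind those READ/WRITE buttons, the app talks to your Lambda function through the AWS SDK for JavaScript. Below is a hedged sketch of what such a call can look like – the payload fields (op, temperature) and the helper name buildInvokeParams are our own illustrative inventions, while AWS.Lambda and its invoke() method come from the AWS SDK; see the demo app’s source for its actual call.

```javascript
// Sketch of invoking the Lambda function from the app. Assumes the
// AWS SDK for JavaScript is loaded and configured as in aws-config.js.
// The payload shape (op/temperature) is a hypothetical example.

// Pure helper: build the parameter object for Lambda.invoke().
function buildInvokeParams(functionName, op, temperature) {
	return {
		FunctionName: functionName,
		Payload: JSON.stringify({ op: op, temperature: temperature })
	}
}

// Wiring (not run here):
// var lambda = new AWS.Lambda({ region: evothings.aws.config.region })
// lambda.invoke(
// 	buildInvokeParams(
// 		evothings.aws.config.params.FunctionName, 'write', 21.5),
// 	function(err, data) {
// 		if (err) { console.log('Lambda error: ' + err) }
// 		else { console.log('Lambda result: ' + data.Payload) }
// 	})
```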

Navigate back to the app main menu and select the third menu option on-screen, the SENSORTAG DEMO example.

Connect the TI SensorTag (30 seconds)

The TI SensorTag has a pushbutton on the side; long pressing it sets the device in announcement mode, whereby a tiny green light flashes continuously. Press the START READING SENSORTAG button in the app, and after a few seconds the data will flow!

But wait, I look again at my egg timer, and this is only a grand total of ten and a half minutes… Well, I know, and there is a plan for this case; we suggest you use the remaining 270 seconds to go ahead and modify the “index.html” file. Every time you press save, all changes you’ve made will reflect automagically on your connected client app. Or why not alter the behaviour by changing the temperatureInterval value to 5000 (ms) on line 64 in the file aws-app-sensortag.html. Press save, and you’ll see temperature data coming in more frequently, without recompiling, re-signing or waiting!

The tutorial is over, but the challenge remains, as they’d say in a cheesy Hong Kong movie. It’s your app now, so feel free to change pretty much any aspect of it you’d like using your HTML5 and JavaScript skills – no compiling, no signing, no waiting either, just the way developing IoT services should be! Why not add some steampunk analog gauges from the Google Charts arsenal, like in another tutorial we published last Summer.

If you’ve read all the way to here without hacking, then you really should get to it and try it out – it’s fun and easy to create mobile IoT services, once you know how it’s done. Don’t forget to drop by the forum to show off what you’ve created, reach out and share code and thoughts, or ask questions to a growing developer community around cloud, mobile, IoT and web technologies.

More on AWS and how to get started: aws.amazon.com/
AWS programme for startups: aws.amazon.com/activate/
AWS Free Tier: aws.amazon.com/free/

More tutorials and useful code for your mobile IoT needs at evothings.com/developer

Develop your IoT app using Ionic Framework and Evothings Studio


ionic_logo

In this tutorial you will learn about the Ionic Framework, and how to use Evothings Studio for rapid development of IoT apps that run on modern smartphones. When we develop HTML5-based mobile apps, we can either roll our own code or, more commonly, choose between a number of HTML frameworks and libraries – high-level libraries above all make it easier to develop mobile-optimized user interfaces and native-like functionality.

The Ionic framework is one of the most popular development frameworks out there. It’s open-sourced and combines many modern web technologies: it’s built with Sass and optimized for AngularJS, and it offers both minimal DOM manipulation tools and hardware-accelerated transitions. The result is a robust framework which also provides many CSS and JavaScript components to build native-like mobile apps.

Evothings Studio is, as you probably know by now, built to rapidly create (and prototype) IoT apps. It allows you to develop and run a basic app in under 5 minutes. It’s easy to use and saves you from doing lengthy package installations. When we combine the powerful SDK of the Ionic Framework with Evothings Studio, we get an environment which is seriously built for developing performance-focused mobile apps.

Using Ionic Framework with Evothings Studio

Ionic Framework provides its own CLI method to get started with HTML5 app development. However, as we are going to use it with Evothings Studio, we will start by downloading (or cloning) the Ionic App Base from its GitHub repository. Once downloaded, just save the app to your favourite work directory on your computer and follow these basic steps:

  1. Download Evothings Workbench (if you haven’t already)
  2. Download the Evothings Client app (from the Android and iOS app stores)
  3. Connect Evothings Workbench with the Evothings Client app using your WiFi IP address
  4. Go to the Ionic App Base directory and drag the “index.html” file to Evothings Workbench; this will create a new project entry
  5. Click the “Run” button alongside the Ionic App Base project; this will load the app into your Evothings Client
  6. Open “index.html” of Ionic App Base in your favourite code editor and make a change; Evothings Studio will immediately reflect the change in Evothings Client

As a final step towards running your own Ionic/Evothings app, explore the Ionic Framework docs to see which JavaScript and CSS components and tools are available to facilitate the development of your IoT app. Let’s suppose you want to include tabs for multi-page navigation in your app; for that, simply follow the tabs example in the docs section of Ionic Framework. For your ease of use, I have used the same example code and pre-packaged it with Ionic App Base; just download the prepared Ionic kitchen sink example code directory from my GitHub account, drag the “index.html” into Evothings Workbench and press the “Run” button to see the tabs example running in your Evothings Client.

Get more examples and tutorials under evothings.com/developer, happy coding!

How to connect your phone to your ESP8266 module


It’s been roughly a year since Hackaday published the article “New chip alert: The ESP8266 WiFi module (It’s $5)”. At the time of publication of that article not much was known about the cheap WiFi module – however, the module was received with open arms by the maker/hacker community and a lot of interesting community projects have been created using it.

esp8266

Today there is an Arduino-compatible SDK available and many more libraries have been written for it. If you have not heard of this module before: it is an 802.11b/g/n WiFi module, equipped with SPI and UART interfaces and, depending on the module, a number of GPIOs. You can find the module for as low as $2 on eBay. The price point makes it perfect for makers that want to explore the Internet of Things. We love great (and cheap) hardware and figured it was time for us to create an example where we made use of the ESP8266 module.

There exist a couple of different modules based on the ESP8266, named ESP-XX, where XX is a number ranging from 01 to 13 (as of the publication of this tutorial). This example was developed using the ESP-01 module; however, it should run on any other module, though the external circuit required might differ depending on which module you use.

You can read more about that on this ESP8266 Wiki.

Source Code

You can browse the source code for this tutorial in the Evothings GitHub repository.

What you need

You will need the following hardware:

  • 1 x ESP-01 (or any other compatible module)
  • 1 x Breadboard
  • 1 x USB – Serial cable (3.3 volt)
  • 1 x Power supply (3.3 volt)
  • 2 x 10 kOhm resistors
  • 1 x LED
  • 1 x 1 kOhm resistor
  • Breadboard jumper cables
  • An iOS or Android smartphone

Step 1 – Hardware

The first step is to connect the module to your computer so that you can upload the firmware. Since this module is a bit simpler than an ordinary development board you need to perform some additional work in order to connect it to your computer. Also, the module runs on 3.3 volts and will fry (not good) if you try to use it with 5 volt signals. So the first step is to ensure that your USB–Serial cable communicates using 3.3 volt signals. Among supported chipsets you will find the FT232RL and CP2102.

The GitHub repository containing the Arduino SDK provides a guide on how to connect your module to your computer.

Connect the LED and the 1 kOhm current limiting resistor in series to the pin named GPIO2.

Step 2 – Embedded software

Preparation

Compared to a lot of other hardware vendors, the community around the ESP8266 has really managed to simplify the installation of the SDK. Using the newly introduced Boards Manager functionality in the Arduino IDE it is extremely simple to install the SDK. Download the environment from the Arduino website, open the Preferences window and enter the following URL

http://arduino.esp8266.com/package_esp8266com_index.json

into the Additional Boards Manager URLs field. Then open the Tools -> Board -> Boards Manager menu in the IDE and install the esp8266 platform SDK. It’s as simple as that.

The official installation guide can be found on the Github repository.

Source code

The Arduino sketch described below can be found on the Github repository.

This application connects to a WiFi network and configures a TCP server listening on port 1337. The server responds to the commands H (high) and L (low), which turn the LED connected to GPIO 2 on and off respectively.

The first thing you need to do is to add your WiFi credentials to the sketch. Find the ssid and password variables in the sketch and replace them with the relevant strings required for your network.

In the setup() function we configure the serial and WiFi connection. GPIO 2 is configured as OUTPUT so that it can control the connected LED. Finally the application starts to listen for connections on port 1337. The entire setup() function is provided below.

void setup(void) {
  
  Serial.begin(115200);
  WiFi.begin(ssid, password);
  
  // Configure GPIO2 as OUTPUT.  
  pinMode(ledPin, OUTPUT);  
  
  // Start TCP server.
  server.begin();
}

In the loop() method we first ensure that we are connected to the WiFi network, and if we for some reason lose our connection we try to reconnect. During the development of this example we lost the connection from time to time – possibly because of our crowded wireless working environment. You may experience that as well; the root cause can be difficult to detect when working with radio-enabled systems, so adding the following code snippet will ensure that the module reconnects to the WiFi if it is within reach.

 if (WiFi.status() != WL_CONNECTED) {
    while (WiFi.status() != WL_CONNECTED) {
      delay(500);
    }
    // Print the new IP to Serial. 
    printWiFiStatus();
  }

The last part of the software checks if any new clients have connected and if they have sent any data. If there is data available it is read and interpreted: if an H is received the LED is turned on, and if an L is received it is turned off. Any other characters are simply ignored. The implementation follows here.

WiFiClient client = server.available();
  
  if (client) {
    
    Serial.println("Client connected.");
    
    while (client.connected()) {  
      
      if (client.available()) {
        
        char command = client.read(); 
        
        if (command == 'H') {
          
          digitalWrite(ledPin, HIGH);
          Serial.println("Led is now on.");
        }
        else if (command == 'L') {
          
          digitalWrite(ledPin, LOW);
          Serial.println("Led is now off.");
        }        
      }
    }
    
    Serial.println("Client disconnected.");
    client.stop();
  }

Unfortunately we had some issues with the code above; from time to time the client.connected() call failed to detect when a client disconnected. In a more advanced implementation a watchdog timer could be implemented alongside the method call. If you do experience any issues, try restarting the module or consider implementing watchdog functionality in the sketch.

Step 3 – Mobile application

Preparation

If you already have a fully working copy of Evothings Studio you can skip this step.

First you have to install Evothings Studio on your computer. Evothings Studio consists of two pieces of software that interact with each other. You have to install Evothings Workbench on your computer; it provides the interface in which you will perform your development. You also have to install the Evothings Client app (iOS, Android) on your smartphone. The client connects to the workbench and executes the mobile application.

Source code

The source code can be found on the Evothings Github repository.

The application is designed to interface with the previously implemented TCP server on the ESP8266 and sends commands to turn the LED connected to GPIO 2 on and off. The application consists of three files: index.html, which contains all the code connected to the user interface, app.css, which contains the style sheets of the application, and lastly app.js, which contains all the logic of the application.

esp8266-controlView

The user interface consists of three views, startView, connectingView and controlView. The startView, see above, contains the initial view that is displayed to the user when the application is launched. It makes it possible to connect to the ESP8266 module using its IP address. The connectingView is displayed when the user tries to connect to the board. And finally the controlView, see below, which is displayed once the application is connected to the module and ready to send commands.

esp8266-startView

The logic behind the application should be fairly straightforward and is described in the following paragraphs. The method app.connect() is executed when the user presses the connectButton. The implementation then tries to open a TCP socket to the ESP8266 module; if it succeeds, it displays the controlView. The implementation can be viewed below.

app.connect = function() {

	var IPAddress = $('#IPAddress').val()
	console.log('Trying to connect to ' + IPAddress)

	$('#startView').hide()
	$('#connectingStatus').text('Connecting to ' + IPAddress)
	$('#connectingView').show()

	chrome.sockets.tcp.create(function(createInfo) {

		app.socketId = createInfo.socketId

		chrome.sockets.tcp.connect(
			app.socketId,
			IPAddress,
			app.PORT,
			connectedCallback)
	})

	function connectedCallback(result) {
	
		if (result === 0) {

			 console.log('Connected to ' + IPAddress)
			 $('#connectingView').hide()
			 $('#controlView').show()
		}
		else {

			var errorMessage = 'Failed to connect to ' + IPAddress
			console.log(errorMessage)
			navigator.notification.alert(errorMessage, function() {})
			$('#connectingView').hide()
			$('#startView').show()
		}
	}
}

When a connection is established the application awaits user interaction: the user can either touch the circle that represents the LED or press the disconnect button. If the circle is pressed, either app.ledOn() or app.ledOff() is invoked, depending on the state the LED is in. The implementation of these two methods can be found below.

app.ledOn = function() {

	app.sendString('H')
	$('#led').removeClass('ledOff').addClass('ledOn')
	$('#led').unbind('click').click(function(){
		app.ledOff()
	})	
}

app.ledOff = function() {

	app.sendString('L')
	$('#led').removeClass('ledOn').addClass('ledOff')
	$('#led').unbind('click').click(function(){
		app.ledOn()
	})
}
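The app.sendString helper used above is defined in app.js; here is a hedged sketch of how it can be implemented. The stringToBuffer helper name is our own; chrome.sockets.tcp.send is the same sockets API used elsewhere in the app, and it expects an ArrayBuffer rather than a plain string, so the single-character command is converted to bytes first.

```javascript
// Sketch of a sendString implementation (the real one lives in app.js).

// Pure helper: ASCII string -> ArrayBuffer, since
// chrome.sockets.tcp.send() does not accept plain strings.
function stringToBuffer(string) {
	var buffer = new ArrayBuffer(string.length)
	var view = new Uint8Array(buffer)
	for (var i = 0; i < string.length; i++) {
		view[i] = string.charCodeAt(i)
	}
	return buffer
}

// Wiring (not run here; app.socketId is set in app.connect()):
// app.sendString = function(string) {
// 	chrome.sockets.tcp.send(
// 		app.socketId,
// 		stringToBuffer(string),
// 		function(sendInfo) {
// 			if (sendInfo.resultCode < 0) {
// 				console.log('Send failed: ' + sendInfo.resultCode)
// 			}
// 		})
// }
```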

If the disconnectButton is pressed, the app.disconnect() method is invoked. This method simply closes the socket and switches back to the startView.

app.disconnect = function() {

	chrome.sockets.tcp.close(app.socketId, function() {
		console.log('TCP Socket close finished.')
	})
	$('#controlView').hide()
	$('#startView').show()
}

To keep this tutorial simple we have ignored the fact that a command sent might not be received by the ESP8266 module. A simple way to handle this is to implement an acknowledgement mechanism: every time the ESP8266 turns the LED on or off, it should acknowledge success or failure by sending a message back to the application. In the example we did previously for the MediaTek Connect 7681 we implemented a simple version of such functionality; if you are curious you can have a look at that source code, which can be found in our Github repository.
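As a sketch of what such an acknowledgement could look like on the app side, suppose (hypothetically) that the module replies with a single byte after each command: ‘K’ for success and ‘E’ for failure. The helper below is our own invention, not part of the example code; the commented wiring uses chrome.sockets.tcp.onReceive from the same sockets API the app already uses.

```javascript
// Hypothetical ack protocol: the module replies 'K' (ok) or 'E' (error)
// after each command. interpretAck is a pure helper; names are our own.
function interpretAck(byte) {
	if (byte === 75) { return 'ok' }     // 'K'
	if (byte === 69) { return 'error' }  // 'E'
	return 'unknown'
}

// Wiring (not run here): listen for incoming data on the socket.
// chrome.sockets.tcp.onReceive.addListener(function(info) {
// 	var view = new Uint8Array(info.data)
// 	if (interpretAck(view[0]) !== 'ok') {
// 		console.log('Command was not acknowledged, consider resending.')
// 	}
// })
```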

Summary

This tutorial shows you how simple it is to develop a mobile application for your ESP8266 module using nothing but HTML5 and JavaScript. This is our small contribution to the ever growing open source community revolving around the cheap and powerful ESP8266 modules. Feel free to use the source code as a starting point for your own project. Order a couple of ESP8266 modules, download Evothings Studio and start working on your own project today! We cannot wait to see what you will build – please share any creations with us @evothings.

Controlling a JavaScript Lunar Lander game on your phone from the Arduino


Learn how to connect an Arduino with Ethernet connectivity to your smartphone in this developer’s article. You’ll learn how to use events from a physical pushbutton and a variable resistor to control the gaming action. Remember Lunar Lander? Lucky for us, so did Seb Lee-Delisle, an award-winning digital artist who has put together a strictly non-commercial tribute clone of that original program, partly for our enjoyment as part of his project Lunar Trails, and also because JavaScript is so much fun. You can find the “original clone” on the web right here [http://moonlander.seb.ly/]

During one of our offsite Fridays at Evothings, we decided to contribute a bit to this open project by running the Lunar Lander on either an iOS or Android device, while controlling it over the house network using an Arduino UNO with an Ethernet shield attached to the router. And rather than running a fat server on the microcontroller or the phone, we wanted to use raw TCP communication, i.e. the Chrome sockets, to minimise the network overhead and thereby reduce the latency, so we don’t unnecessarily crash our lunar lander. I wrote this walk-through as I thought you’d like it too. And Fredrik did most of the real work.

Here is a sketchy video of what to expect once you’re done, with Fredrik’s narration:

Let’s get to it

To get started, you need an Arduino UNO or compatible, a pushbutton, a variable resistor (our potentiometer is a standard 10kΩ one), some patch cables and a breadboard. For connectivity we’ve chosen an Arduino Ethernet shield as they are relatively common and reliable. See the Fritzing sketch below, connecting the potentiometer to analog pin 0 and the pushbutton to digital pin 4.

fritzing_sketch_lunar_lander

After finishing your hardware set-up, you’ll want to download both the Arduino sketch (.ino file) and the mobile app from our contributors’ demo repo on Github. If you’re unfamiliar with cloning using git, you can instead download the demos folder as a single zip file, and after unzipping open up the Lunar Lander demo folder. Inside it, you’ll find one subfolder for the app and a sibling folder with the Arduino code. Note: all code for this example is released as-is, sans commercial or other proprietary intent, copyleft on our contributors’ Github repository.

Now we have all the parts, time to get to work with the software!

Upload to the Arduino microcontroller

Open the arduinoeth.ino file of the project in the Arduino IDE on your desktop computer; remember to select the correct profile under “Tools > Board”, and the right USB port under “Tools > Port”, before uploading.

In the Arduino IDE, open “Tools > Serial Monitor” to see what’s going on and how the Ethernet shield is doing in terms of picking up an IP address. If you don’t have a network that gives you an address (DHCP), you’ll need to uncomment and change lines 48-50 in the arduinoeth.ino file. Make a note of the IP address of the Arduino from the monitor window, as it’s going into the app code shortly.

Setting up Evothings Studio

Download Evothings Studio to your computer, and the corresponding app for your phone:

(Option 1, recommended for Production) Download Evothings Studio 1.2 Stable
evothings.com/download
The corresponding client is called “Evothings Client”, found on the public app stores.

(Option 2, for the brave at heart) Evothings Studio 2 Alpha
evothings.com/download-evothings-studio-2-0-alpha
The corresponding client is called “Evothings Viewer”, also found on the public app stores.

Here is a short video, on getting started with version 2. If you use Evothings Studio v1 Stable, here is a corresponding video here.

Configuring your app

You’ll find an index.html file containing the layout and the container tag for the game. You can fill in the IP address of the Arduino here on line 128, in the element with id=”ArduinoIpAddress”, so you won’t have to add it in the interface with chubby fingers like mine. You also need to fill in the address in the app.js file (in this current version at least), where the chrome.sockets.tcp.connect call is made on line 26. That’s it, you should be ready to go! Make sure the phone is on the same network as the Arduino device (turn WiFi on) so they can find each other.
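If you are curious about the readings coming in from the Arduino, a parser in the app might look like the sketch below. The line format “B:&lt;button&gt;,P:&lt;pot&gt;” is purely hypothetical, invented here for illustration; the actual protocol is defined in arduinoeth.ino and app.js, so adapt accordingly.

```javascript
// Hypothetical parser for readings sent by the Arduino over TCP.
// Assumed line format: "B:<button>,P:<potentiometer>", e.g. "B:1,P:512".
// The real protocol is defined in arduinoeth.ino -- adapt as needed.
function parseReading(line) {
	var match = /^B:(\d+),P:(\d+)$/.exec(line.trim())
	if (!match) { return null }
	return {
		button: parseInt(match[1], 10),
		potentiometer: parseInt(match[2], 10)
	}
}
```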

Run, run, run

All you need to do to run the example is to drag-n-drop the index.html file from your Lunar Lander project into the Workbench project window and press “RUN”. In the app, press connect and then press your pushbutton on the breadboard to make your first landing attempt. A small step for one developer, a giant leap for the kids when they see how fun video games were in 1979.

Joking aside, you should really use the ideas from this project, adapt them to something useful and perhaps even develop them further to meet your own requirements for sending data from a connected device. There are tons of other examples and tutorials on our developer site evothings.com/developer for your Arduino and other hardware, APIs and cloud services. Feel free to join the action. We also have a forum at evothings.com/forum for any idea or question you might have. Happy coding!

Detecting Eddystone™ beacons in JavaScript made easy – Evothings presents the Cordova Eddystone plugin


Eddystone Lighthouse

Eddystone™ beacons are coming to Cordova land! It is easy to write mobile apps for Eddystone beacons with the new open-source Eddystone Cordova plugin created by Evothings.

In this tutorial you will learn how to create a Cordova app written in JavaScript that can detect Eddystone beacons. The plugin can also be used with PhoneGap and Ionic.

Why Eddystone matters

Eddystone is the most significant beacon technology we have seen at Evothings to date. Google made the Eddystone beacon standard public in the Summer of 2015. The release was accompanied by announcements of Eddystone beacons from major beacon vendors like Estimote, Kontakt.io and Radius Networks.

estimote-beacons-group-small

If you have not heard of beacons before: a beacon consists of a small computer chip with a Bluetooth radio transmitter that sends out data signals over a range of between 30 and 100 meters. This tiny “lighthouse” (Eddystone is actually a proper lighthouse in south-west England) allows mobile phone users to detect and access information about what is present at a physical location.

You can literally browse the streets by scanning for beacons nearby. Perhaps there is a shop or bar around that you did not know about? You might discover that a cool band is playing tonight at the restaurant where you are having lunch. You may even connect to machinery and services nearby via a beacon (perhaps get something to drink from a futuristic vending machine while exploring all the beacons).

Eddystone uses a different method for sending out signals than Apple iBeacon does. The difference is much more far reaching than you might first think.

Here is why Eddystone matters:

  • It is an open standard that can be implemented on any platform that supports the required Bluetooth Low Energy functionality.
  • By contrast, the iBeacon format offered by Apple is proprietary and not officially supported on Android.
  • Support for scanning for Eddystone beacons is available on both iOS and Android.
  • Apps can scan for any Eddystone beacons, regardless of who made them or who placed them on location.
  • With iBeacon, apps need to be preconfigured with fixed beacon UUIDs, and Apple does not allow the user of an app to alter these ids.
  • Eddystone beacons can broadcast URLs, meaning that apps can directly act on beacon data sent out, without the need for access to a lookup table of beacon ids on a server somewhere. This is much more flexible and open than iBeacon, which locks down the use of beacons to specific apps.
  • Depending on your application, Eddystone beacons can offer you game-changing opportunities.
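Broadcasting URLs is done with the Eddystone-URL frame type: to squeeze a URL into a tiny advertisement packet, the specification replaces common URL prefixes and suffixes with single bytes. The decoder below is a sketch written from the encoding tables in the published Eddystone-URL specification, not code taken from the plugin (which handles this for you):

```javascript
// Decode an Eddystone-URL payload (the bytes after the frame type and
// TX power fields) into a URL string, per the Eddystone-URL spec tables.
var URL_PREFIXES = ['http://www.', 'https://www.', 'http://', 'https://']
var URL_EXPANSIONS = [
	'.com/', '.org/', '.edu/', '.net/', '.info/', '.biz/', '.gov/',
	'.com', '.org', '.edu', '.net', '.info', '.biz', '.gov'
]

function decodeEddystoneURL(bytes) {
	// First byte selects the URL scheme prefix.
	var url = URL_PREFIXES[bytes[0]] || ''
	for (var i = 1; i < bytes.length; i++) {
		var b = bytes[i]
		// Bytes 0x00-0x0d are expansion codes, the rest is plain ASCII.
		url += (b < URL_EXPANSIONS.length)
			? URL_EXPANSIONS[b]
			: String.fromCharCode(b)
	}
	return url
}
```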

kontakt_io

For example, as the owner of a shop or cafe, you can just place a beacon at the entrance door, configured with the URL of your web site – and anyone within 50 to 100 meters strolling by on the street with an Eddystone scanner/browser app can see your beacon and access your web site, Facebook page, Twitter flow or a database of all the beer you currently have in store. This is a perfect way for people in a crowded city to get an overview of what is around them.

Other applications include providing information to travellers and tourists, hotel guests, visitors at museums, airports, hospitals, and so on. Industry applications can use beacons to track production, goods, spare parts, transports, etc.

After this introduction to bring you up to speed on beacons, we dive right into the hands-on tutorial for creating an Eddystone mobile application in JavaScript.

What you need

Beacons

You need at least one Eddystone beacon to test the app with. Beacons can be purchased from vendors like Estimote, Kontakt.io, Radius Networks.

rad_beacons


Or a beacon emulator

If you don’t have any beacons and want to get started right away, you can run an Eddystone emulator on your computer. Your computer must have support for Bluetooth Low Energy (BLE). There are several emulators around that send out Eddystone signals. One we recommend is node-eddystone-beacon (first install node.js to use this emulator).

Apache Cordova

To build the app you need the Cordova build system, with build tools installed for Android and/or iOS. Visit the Cordova documentation for how to get started. Also check out the Evothings Cordova Guide.

Step 1: Create a Cordova project

Once you have Cordova installed, create a new project with the following command:

cordova create myapp com.mydomain.myeddystoneapp Eddystone

This creates a Cordova project in the folder myapp, with the given app id, and the name “Eddystone”.

Step 2: Add the Eddystone plugin

Go into the app folder you’ve just created and add the plugin with these commands:

cd myapp
cordova plugin add cordova-plugin-eddystone

Step 3: Add platforms

Next add the mobile platform(s) you want to build for (iOS and/or Android):

cordova platform add ios
cordova platform add android

Step 4: Copy sample application code

EddystonePluginDemoApp
The Eddystone plugin comes with a sample application that displays detected beacons and their data. The code is contained in a single index.html file.

Go to the www directory in the Cordova project and open index.html in a text editor. Delete the contents of the file (so you get a blank document). Then copy and paste the code from the Eddystone example app; open the example code in raw format, which is easy to copy and paste.

Save index.html. Now we are ready to build and run the app.

Step 5: Build and run

On iOS build the app with:

cordova build ios

Open the generated Xcode project in the folder platforms/ios and launch the app on an iOS device (the simulator won’t work, as it does not support BLE).

On Android, build with this command:

cordova build android

This generates an Android Studio project. One way to install the app on an Android phone is to use the adb command:

adb install -r platforms/android/build/outputs/apk/android-debug.apk

When running the app you should be able to see any nearby Eddystone beacons appear on the display!

Dive into the code

A good starting point is to study the example code in index.html.

It is very easy to start scanning for beacons in JavaScript. Just call the startScan function:

evothings.eddystone.startScan(successCallback, errorCallback)

For example:

function successCallback(beacon)
{
   console.log('Jippie, found a beacon: ' + beacon.name)
   if (beacon.url)
       console.log('  url: ' + beacon.url)
}

function errorCallback(error)
{
    console.log('Darn, something went wrong: ' + error)
}

evothings.eddystone.startScan(successCallback, errorCallback)

Make sure to visit the plugin README file to learn more about the Eddystone JavaScript API.
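If you find yourself logging beacon data in several places, the callback above can be tidied up with a small helper. The function below is purely illustrative (it is not part of the plugin API) and assumes the beacon object carries name, url and rssi fields:

```javascript
// Illustrative helper, not part of the Eddystone plugin API.
// Builds a one-line status string from the beacon fields used above;
// the rssi field is an assumption about the beacon object's shape.
function formatBeacon(beacon)
{
    var parts = [beacon.name || '(unnamed)'];
    if (beacon.url) { parts.push(beacon.url); }
    if (typeof beacon.rssi === 'number') { parts.push(beacon.rssi + ' dBm'); }
    return parts.join(' | ');
}
```

You could then call console.log(formatBeacon(beacon)) from the success callback.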

Rapid beacon app development with Evothings Studio

This tutorial introduces the Evothings Eddystone plugin. In upcoming posts, we will show how to use Evothings Studio to develop Eddystone apps interactively, with a very quick turnaround time.

If you wish to take a sneak preview, download Evothings Studio 2.0 Alpha, install the Evothings Viewer app, and run the Eddystone Scan example that comes with the Studio download. It is fun and easy to get started!


Finalizing an Evothings app for store submission with PhoneGap Build


Once you've created a hybrid app using Evothings Studio, a natural next step is to publish it on one of the public app stores. You can be traditional and build it yourself using the Cordova Command Line Interface (CLI), but why not utilise one of the several companies offering build services? A good representative of these online services is Adobe's PhoneGap Build. This tutorial explains the steps required to make your app available on Google Play; a similar workflow will also take the app to Apple's iTunes and the Windows Store.

For the purpose of this tutorial, we will be using something relatively simple: the Cordova Accelerometer example app which comes bundled with Evothings Studio. You can read the example tutorial here and download the related example code from the Evothings examples GitHub repository. You can also use your own example app, though the steps required may vary slightly depending on the nature of your application. Let's start building the example code into a stand-alone app using PhoneGap Build. Here we go:

Step 1: Create an Adobe ID to use PhoneGap Build

Sign up for an Adobe ID – for now you can choose the free plan. After signing up, you will see a welcome screen with two tabs: one for open source apps and one for private apps. Under the free plan, you can build unlimited open source apps, but they have to be pulled from a valid, publicly available GitHub repository. You can alternatively build a private app by uploading the JavaScript and HTML files to the build manager.

Step 2: Get the example source code

If your own app code is hosted in a public repository on GitHub, it can be pulled straight into PhoneGap Build. In our case, the complete example code is most easily found bundled in the examples catalog of Evothings Studio (we do have a public repo for all our bundled examples on GitHub, but these are source files that are processed further, with resources added, each time Evothings Studio is built and shipped; you wouldn't want to use them as is).

So open the Evothings Workbench on your PC, locate the example “Cordova Accelerometer” and click the related “Code” button; you will find the folder with the source code of this example.

Step 3: Identify your required plugins

If you are developing an IoT app, chances are that you may have used multiple plugins in order to access native device features such as camera or accelerometer.

While using PhoneGap Build, one has to specifically identify the required plugins via the config.xml file, even if they come as part of the default Cordova package.

Since our example app uses the Cordova Device Motion and Cordova Vibration plugins, we will need to specify these plugins in the config.xml file.

Step 4: Using config.xml for specifying the required plugins

Create a config.xml file inside the top-level directory of the example source code and paste the following code into the file. The file encoding should be UTF-8 for it to work properly; set this condition in your text editor under “Encoding” or similar.

<?xml version='1.0' encoding='utf-8'?>
<widget
   id="com.evothings.accelerometer-demo"
   version="0.0.1"   
   xmlns="http://www.w3.org/ns/widgets"
   xmlns:gap="http://phonegap.com/ns/1.0">

<name>Cordova Accelerometer</name>
<description> Testing the on-board accelerometer </description>
<author email="websmurf@evothings.com" href="http://evothings.com"> Webb S Murph </author>
<content src="index.html" />

<gap:plugin name="cordova-plugin-device-motion" source="npm" />
<gap:plugin name="cordova-plugin-vibration" source="npm" />

<access origin="*" />
<allow-intent href="http://*/*" />
<allow-intent href="https://*/*" />

<platform name="android">
   <allow-intent href="market:*" />
</platform>

<platform name="ios">
   <allow-intent href="itms:*" />
   <allow-intent href="itms-apps:*" />
</platform>
</widget>

Notice the “gap:plugin name” elements; that is where we notify the build process that it needs to include the Cordova Device Motion and Cordova Vibration plugins while building the app.


You can also provide a version number for the plugin; if none is provided, the build process will fetch the latest version of the plugin. If you want to know more about including plugins in the build process, the PhoneGap Build docs have all the required information.
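For example, pinning one of the plugins used above to a specific version could look like this (the version number here is made up, purely for illustration):

```xml
<gap:plugin name="cordova-plugin-device-motion" version="1.2.0" source="npm" />
```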

You can also add other intents, depending on what URI schema you’re attempting to use:


<allow-intent href="tel:*" />
<allow-intent href="sms:*" />
<allow-intent href="mailto:*" />
<allow-intent href="geo:*" />

You can make a far more elaborate configuration file, and the folks at Adobe have put together a kitchen-sink version for this purpose, with most of the required bells and whistles, right here.

Step 5: Upload a zip file containing your Evothings app folder

Now zip the example app folder and upload it to PhoneGap Build as a private app. After the upload is completed, you will be asked to enter the name of your app along with its description. On the same page you will also see options to Enable Debugging and Enable Hydration; you can read about these options in the PhoneGap Build docs, but for now just leave them unchecked, supply the app name and description, and click the “Ready to build” button.

On the next page, click the Android tab and you will be presented with the app details page. Here you will immediately see the iOS build throwing an error, while the Android and Windows Phone builds may show a spinning wheel; after some time the spinning wheel will go away, and buttons to download the xap for Windows Phone and the apk for Android will appear instead. You can either click these buttons or use the “Install” button above to download your app from the PhoneGap Build service. Your app build is now complete, and you can use it for local distribution or for your own private use (don't forget to allow apps from “unknown sources” on your Android phone). You can also click the “Collaborators” tab to send the build to testers and other members of your team.

Step 6: Signing the app for app stores

The iOS build failed because PhoneGap Build needs a signing key from the Apple Developer Centre to compile an app. Similarly, the Android and Windows Phone apps are also not ready for upload, as the required signing key for Android and the required publisher key for the Windows Phone app store were not supplied before the build process.

Signing keys are necessary to protect the app from misuse and to certify that it has the correct origin. You can generate these keys by obtaining membership of the respective app stores. For further information about generating signing keys, follow these links:

Once you have obtained your publisher key for Windows Phone and signing keys for iOS and Android respectively, you can add them to your app in your PhoneGap Build account, rebuild, and download the signed app that you can now upload to the respective app store.

No more IoT tears – how to upgrade the Arduino WiFi Shield painlessly from a Mac


Ok, so you got the message in the Arduino SDK saying “Please upgrade the firmware” or “WiFi shield not present”; then it's time to carry out an upgrade.
Don't worry, everything you need is in the Arduino SDK package already on your computer. You'll need to use the terminal, and to have root access
to your computer. This how-to shows how it's done on a Mac, but you'll hopefully find it useful on other computers too. I'm taking it step by step, so that newbies too can do it without calling their IT support, dad, son or 10-year-old neighbour's kid.

This is how to go about it:

  1. Set the jumper J3 (see pic) by bridging the two pins with the plastic connector. Then connect the shield (not the Arduino) to a 5-pin mini USB connector cable (a.k.a. B5) and plug the other end into the computer; the Arduino can stay unplugged, and you don't need to take the shield off physically. If the red LED goes on, you can press the shield's reset button, the white button in the upper-left corner of the shield.
  2. Open the Terminal.app on your Mac and go to the upgrade script. In my installation, the scripts for upgrading the hardware were found here (my commands in bold typeface):
  3. cd /Applications/Arduino.app/Contents/Java/hardware/arduino/avr/firmwares/wifishield/scripts
  4. Once in that directory, make the script executable with the chmod command; you will need to be root or a privileged user to carry out this operation:
  5. sudo chmod 770 ./ArduinoWifiShield_upgrade.sh
  6. You're now ready to run, take a deep breath. You can either upgrade only the shield (-f shield) or both the antenna firmware and the shield (-f all). My Arduino installation, carrying the /hardware folder and its scripts, firmware et cetera, is located at /Applications/Arduino.app/Contents/Java/ which you also need to state, as it defaults elsewhere. This is the command to use. Check with the pwd command that you're in the scripts folder, or it will fail:
  7. pwd
    /Applications/Arduino.app/Contents/Java/hardware/arduino/avr/firmwares/wifishield/scripts
    
    sudo ./ArduinoWifiShield_upgrade.sh -a /Applications/Arduino.app/Contents/Java -f all
  8. Follow the on-screen guide in the Terminal window, press RETURN and the reset button when asked, and you’re done. Unplug the mini USB from the Wifi Shield, and resume normal operation.

This guide was inspired by this webpage on upgrading the shield firmware, along with some trial and error, so you won't have to repeat it. That page also describes, in a good way, how to install the dfu-programmer executable if you don't have one on board.

Eddystone™ – Bluetooth Beacons are getting smarter


By Thomas Larnhed

Eddystone™ is one of the most interesting developments in the Internet of Things right now. What is Eddystone and what can you do with beacons? Read on to learn more about this exciting new Internet of Things technology that can fit in your pocket and be applied to virtually any place or object around you.

Beacons are cheap, simple and need very little energy

What people think of when talking about beacons is probably small stand-alone devices with a little battery and a big self-adhesive patch on the back. But a beacon chip can be made very tiny and be embedded anywhere, in a bicycle frame, on a shopping cart, by a coffee machine or in-seam in a duffel bag. Smartphones, iPads, laptops or USB sticks can also be programmed to act as Beacons.

A beacon is really just a little Bluetooth chip configured as a small radio that can transmit certain kinds of messages over short ranges (up to 100 metres). A basic beacon has limited capabilities and only uses around 10% of the Bluetooth specification. It only transmits, and never receives any data (apart from when you configure it as a technician). It is one-way and never knows by itself if anyone is listening to it.

Since the chip only sends data at intervals, it also consumes very little power. Stand-alone beacons often have batteries that can last anywhere from months to a couple of years. They are small, rather uncomplicated devices, cheap to manufacture and can be bought for less than $10.

Almost all smart phones, iPads, laptops and similar devices have built-in Bluetooth capabilities and can pick up signals from beacons.

So – a beacon is cheap, simple, needs very little energy, can be embedded anywhere and transmits messages to any mobile phone, pad or other device that wants to listen and is close enough.

Now, this opens up a lot of exciting possibilities…

A real-world example

Imagine that you are shopping for a new TV. You are walking around in the TV department in your local store, browsing among the available 10-15 different models. After a while, you get a message on your mobile phone that says you can have a 10% discount on one of the models. What is striking is that this happens to be one of the models you like the most – perhaps even your favourite one. You also get more detailed information about this model and since you are a member in the loyalty program of the store you also get an even larger discount if you combine the TV with other products.

In this example a beacon has been placed by each TV model and an app has simply measured how much time the customer lingers around each model.
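The lingering measurement can be sketched in a few lines of JavaScript. This is only a sketch with made-up names; a real app would feed it beacon sightings from its scanning callback and use real timestamps:

```javascript
// Records when each beacon was first and last seen, so the app can
// tell how long the customer has lingered near a given TV model.
function DwellTracker()
{
    this.sightings = {};
}

// Call this from the beacon scanning callback with the beacon id
// and the current time in milliseconds.
DwellTracker.prototype.sighted = function(beaconId, timestampMs)
{
    var s = this.sightings[beaconId];
    if (!s) { s = this.sightings[beaconId] = { first: timestampMs }; }
    s.last = timestampMs;
};

// Dwell time in seconds for a beacon, or 0 if it was never seen.
DwellTracker.prototype.dwellSeconds = function(beaconId)
{
    var s = this.sightings[beaconId];
    return s ? (s.last - s.first) / 1000 : 0;
};
```

When the dwell time for a beacon passes some threshold, the app could trigger the discount notification.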

Retail is probably one of the application areas most frequently discussed and where most applications have been built so far. But there are many other use cases, and industries hooking up to beacons. A bus stop could send out transit times and time tables, a museum can give you information about the exhibit you are standing in front of, a restaurant can send the menu to people passing by, a vending machine can call for maintenance when it needs refilling etc.

Use cases can be found anywhere people are in need of relevant information based on their current physical location. Airports, zoos, amusement parks, concert halls, sports stadiums, shopping malls, hospitals, warehouses… Wherever there are people, there is an opportunity for communicating and reaching out using beacons.

iBeacon – the Apple standard

iBeacon has been around for two years and is Apple's standard for beacons. There are already several successful applications built, especially in the retail sector. Large installations have been deployed by, for instance, Macy's in New York City and San Francisco, and by Tesco in London.

The iBeacon format officially only supports iOS and is technically rather limited.
The only thing a Bluetooth chip configured as an iBeacon transmits is an ID that identifies it, along with some parameters to distinguish a particular beacon within a group. It essentially says something like:

“My identity is x and you are within my range”

The ID can be used by an app to push notifications to the user, call a server to get additional data, or basically anything you can envision when designing the application. The important thing is that you get the resources from a server, not from the actual beacon itself.

You can also check the signal strength of the beacon and determine the approximate distance to the beacon:

“My signal strength is xx”

The computed distance based on the signal strength typically fluctuates due to variations and disturbances in the radio signal, but it is possible to get an approximate range estimate, typically reported as one of “immediate”, “near” and “far”.
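A common way to turn signal strength into such an estimate is a log-distance path-loss model, sketched below. This is an approximation and not part of the iBeacon specification; txPower is assumed to be the calibrated RSSI at one metre, and n is an environment-dependent path-loss exponent (roughly 2 in free space):

```javascript
// Rough distance estimate (in metres) from RSSI, using a
// log-distance path-loss model. All names here are illustrative.
function estimateDistance(rssi, txPower, n)
{
    n = n || 2.0;
    return Math.pow(10, (txPower - rssi) / (10 * n));
}

// Map a distance estimate onto the coarse zones mentioned above.
function proximityZone(distanceMeters)
{
    if (distanceMeters < 0.5) { return 'immediate'; }
    if (distanceMeters < 4.0) { return 'near'; }
    return 'far';
}
```

The zone thresholds are also illustrative; real implementations smooth the RSSI over time before estimating anything.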

Eddystone – beacons are getting smarter

Eddystone is Google's answer to iBeacon and was released in the summer of 2015. It is a completely open standard, named after a famous lighthouse in the English Channel.

Eddystone works both on Android and iOS, and since it is an open standard it can and will work on any Bluetooth Low Energy capable platform.

In contrast to iBeacon and other earlier variations on beacons, Eddystone has what is called multi-frame support, which means it can send different types of information packets, not only an ID. It can transmit URLs, and it has frames that enhance privacy and security. There are also frames with information about the beacons themselves, such as remaining battery life and temperature, which will make it easier to maintain large beacon installations.

The capability of Eddystone beacons to broadcast URLs is groundbreaking. This might not sound like such a big deal, but it in fact has the potential to change the mobile app landscape as we know it, and to bypass the need for a specialised app altogether.

Imagine the TV shopping example above. In this example the customer needs to have an app installed on her phone. The beacon sends out the ID and the app translates it to something meaningful. But customers might not want to have an app for each store they visit. When beacons send out URLs, it is enough for the customer to have a browser running. The URL can then trigger a web-based app, and you could do anything from there. With iBeacon this scenario is not possible, since each app needs to have a list of beacon IDs it can listen for. With Eddystone, however, you can literally browse beacons by moving around. Eddystone enables the physical web.
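The URL frames themselves are very compact: the Eddystone-URL frame type packs the URL scheme and common domain endings into single bytes. A minimal decoder following the public Eddystone-URL specification could look like this:

```javascript
// Decodes an Eddystone-URL frame (frame type 0x10) into a URL string,
// per the Eddystone-URL specification: byte 0 is the frame type,
// byte 1 the TX power at 0 m, byte 2 the URL scheme prefix, and the
// remaining bytes are URL characters or single-byte expansion codes.
var SCHEMES = ['http://www.', 'https://www.', 'http://', 'https://'];
var EXPANSIONS = ['.com/', '.org/', '.edu/', '.net/', '.info/', '.biz/',
                  '.gov/', '.com', '.org', '.edu', '.net', '.info', '.biz', '.gov'];

function decodeEddystoneUrl(bytes)
{
    if (bytes[0] !== 0x10) { throw new Error('Not an Eddystone-URL frame'); }
    var url = SCHEMES[bytes[2]];
    for (var i = 3; i < bytes.length; i++)
    {
        var b = bytes[i];
        url += (b < EXPANSIONS.length) ? EXPANSIONS[b] : String.fromCharCode(b);
    }
    return url;
}
```

This is how a 20-odd byte broadcast can carry a complete web address.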

Eddystone beacons can sense their environment

Since different types of data can be included in an Eddystone broadcast, this will be a very effective way to communicate all kinds of sensor data. We will see many types of beacons on the market that in turn include all kinds of sensors and broadcast sensor data.

Some manufacturers, like the Kraków-based company Estimote, are already building sensor capabilities into their beacon hardware, and we have only seen the beginning of this. A beacon device could for instance have an accelerometer, a temperature sensor, a humidity sensor, a magnetometer, a barometric pressure sensor, a motion detection sensor, a light sensor, and the list goes on.

Beacon functionality combined with new BLE devices opens up many very interesting and useful applications: still very low-cost, very low-energy and rather easy to build.

But how do you take advantage of all this new potential for your business or pleasure? How would you best build the services and mobile applications?

Developing with Eddystone beacons

Google integrates beacon functionality in their own applications and services such as Google Maps, Google Now and Google Wallet, which Android users most likely already have installed on their phones. There is also experimental Eddystone integration in Google Chrome which can be set up to scan for Eddystone URLs from within the browser. The idea is that the user should not have to download specific apps for every application. Yet, you’re limited to what’s allowed from within the browser sandbox.

What if you wish to make custom business applications, either for consumers or for industrial use? Or if you wish to build your own Eddystone browser and add more of the phone's native functionality? Would it be possible to develop cost-effective beacon solutions in-house?

For these occasions, a hybrid solution, part native and part web functionality, can be the answer to your needs. Evothings Studio 2.0 is a professional development environment for building cross-platform mobile applications that leverage the possibilities of the Internet of Things (IoT). Mobile apps are developed and maintained within Evothings Studio using standard high-level web technologies: common languages like JavaScript and HTML5 in conjunction with Evothings' own libraries and prototyping tools.

Application logic is written in JavaScript, and Evothings Studio and its underlying components take care of all the low level details of developing apps for beacons and connected devices. Apps built with Evothings Studio use industry standards and can be published on the app stores as native Android and iOS applications.

Get started with Eddystone app development in just 5 minutes


Feel free to download and try out Evothings Studio. It is easy to get going: it takes just 5 minutes to get started with an Eddystone app on your mobile phone. Just run the “Eddystone Scan” example app, get your hands on some Eddystone beacons, and you're up and running in no time!

Listing Arduino SD file names with Evothings and Ionic


In this tutorial we will use the Ionic Framework together with Evothings Studio to develop an app for listing the filenames in an SD card directory on the Arduino WiFi Shield. Evothings Studio is designed for rapidly prototyping IoT apps using JavaScript and HTML. You can also use all the latest front-end JavaScript and HTML5 frameworks on the market, as well as plug-in functionality available for Apache Cordova, PhoneGap and its clones. The Evothings Viewer comes bundled with functionality like low-level networking (Chrome sockets), Bluetooth Smart, and support for on-board and external sensors.

The Ionic Framework

Traditionally, JavaScript frameworks like jQuery and Sencha were originally developed to be used in browsers running on desktop computers. These legacy frameworks worked relatively well on desktops, but they were sluggish and jittery, with big footprints, when we tried to use them on smartphones. Thanks to the advent of modern web technologies like Angular and React, as well as mobile incarnations of the previously mentioned libraries, we now have access to high-level UI frameworks which are optimized for the mobile experience. Ionic is one such library, offering mobile-optimized HTML, CSS and JavaScript components. It's modeled on native mobile SDKs and uses Angular as a supporting technology to develop and deploy feature-rich and robust mobile apps.

Getting started

One way to get started with Ionic is to use the command line interface (CLI) commands, however for that you will need to install both Cordova and Node.js, which is exactly where Evothings Studio comes in. It allows you to rapidly prototype your mobile app without doing lengthy installations of various packages and libraries.

To develop this example with Evothings Studio yourself, just follow these steps:

  1. Download Evothings Workbench from the Evothings’ website
  2. Download an Evothings Viewer app for your Android and/or iOS phone (search for "evothings" at the public app stores)
  3. Start the Workbench, and generate a connect code under the “Connect” tab, and “Get Code”
  4. Connect the Evothings Viewer app by punching the code into the text input field. At this point you can test the connection by running any example, like Hello World, to see it load in the Viewer app
  5. Download the Ionic app template from my Github demo repo. I host several demos of mine in a single repo, so if you are new to Github, you might want to just download the whole zip file and locate this example once unzipped.
  6. Open the www directory of your download and drag & drop the index.html into the Evothings Workbench window, which will create a new Evothings Studio project entry at the top of the list!
  7. Click the corresponding Run button, which will load the app in the Evothings Client
  8. Open index.html in your favourite code editor and type "Hello World!" under the ionic-content tag; when you save the change, it will instantly be reflected on your phone.

Prepare the hardware

We will need the following components:

  • one Arduino Uno or compatible
  • one Arduino WiFi Shield (if you have another shield from a third party, the WiFi library you need to include may be different)
  • a microSD card and some random files


List of things to do:

  1. Create a folder on your microSD card’s root directory and name it e.g. MUSIC
  2. Copy some files to the /MUSIC folder, using a card reader on your PC
  3. Insert the microSD card in its slot on the WiFi Shield
  4. Stack the WiFi Shield over your Arduino
  5. Open the Print_SD_Files_WiFi_Shield.ino file using the Arduino IDE and change the ssid[] and pass[] variables to suit your own network name and password.
  6. Connect the Arduino to your computer and upload the sketch

Code walkthrough

First come some compulsory headers and includes for the WiFi and SD card libraries respectively. The sketch then starts a TCP server, listening on port 3300:

WiFiServer server(3300);

In setup(), the CS pin (chip select, see https://en.wikipedia.org/wiki/Chip_select) needs to be defined as an output pin:

pinMode(10, OUTPUT);

In the main loop(), once a connection is received, the server reads each request and passes it to executeRequest():

	char command = readCommand(request);
	int n = readParam(request);
	if ('R' == command)
	{
                root = SD.open("/MUSIC");
                printDirectory(client, root, 0);
	}

If the request contains the “R” command, the sketch opens the SD card directory and passes its reference to the printDirectory() function, which in turn iterates over the directory's files and prints their names back to the TCP client. One word of caution: take care of the Arduino's SRAM, as the Arduino Uno has just 2K of SRAM available; if we use too many constants or print too many literal strings over Serial, the SRAM fills up quickly. If the available SRAM is under 400 bytes, the sketch will still iterate over the SD card directory but is unlikely to print the names to the TCP client.

Developing the SD Card File List app

We need two UI views for our app: in one view we will use the Ionic Forms component to input the IP address and connect to the Arduino, and in the second view we will use the Ionic List component to list the SD card file names.

I have uploaded an example app to my Git repository. As previously, download or clone the example, unzip it, drag the www/index.html to the Evothings Workbench and click Run to view the app in the Evothings Client. Next, enter the IP address of the Arduino (you can see it in the Serial Monitor window of the Arduino IDE after uploading the sketch) and press Connect; the app will move to the list view, listing the names of the files present in the /MUSIC directory of your microSD card.

One thing to note here is that the app is not printing the full names of the files; this is because of FAT file naming limitations. You can read more about them here. That's it! You have your app ready!
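The short names you see follow the classic FAT 8.3 scheme: at most eight characters for the base name and three for the extension, with long names truncated and given a tilde suffix. The sketch below illustrates the general idea only; the real algorithm in the FAT specification also handles collisions (~2, ~3 and so on) and invalid characters:

```javascript
// Rough illustration of how a long filename maps to a FAT 8.3 short
// name; not the complete algorithm from the FAT specification.
function toShortName(filename)
{
    var dot = filename.lastIndexOf('.');
    var base = (dot === -1 ? filename : filename.slice(0, dot)).toUpperCase();
    var ext = (dot === -1 ? '' : filename.slice(dot + 1)).toUpperCase().slice(0, 3);
    if (base.length > 8) { base = base.slice(0, 6) + '~1'; }
    return ext ? base + '.' + ext : base;
}
```

This is why a file like mysummersong.mp3 shows up in the app with a truncated name.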


Hybrid development aiming to bridge the bluetooth smart fragmentation gap


Communicating over Bluetooth Smart is an increasingly common way to connect to hardware using a mobile device, and more specifically one of the few methods which doesn't require additional network components, WiFi hotspots or login credentials. In this article, the objective is to introduce novel ways of developing Bluetooth apps for modern smartphones and tablets.

Bluetooth Smart, a.k.a. Bluetooth Low Energy or Bluetooth 4.0, is perhaps most importantly known as a growing industry standard under bluetooth.org. This radio technology has many traits of a vanilla connection tool for all things connected, allowing operation on pretty much all modern handsets while lean enough for devices feeding from coin-cell batteries and even energy-harvesting devices embedded in constructions. Interestingly, it's also fairly secure with its built-in 128-bit AES data encryption.

Up until recently, access to the Bluetooth subsystem was a privileged path for native developers with skills on each respective platform, and mobile devices running Windows in particular were late to the party, as Bluetooth Smart support was made available only in the last 2-3 years. In practice, this means that Bluetooth Smart is fragmented both on handsets and connected peripheral devices. A path to bypass differences in Bluetooth implementations is via hybrid applications – partly written in native code and partly made up of HTML5 and JavaScript – where the very same code executes seamlessly on any Bluetooth-capable handset.

In practice this means that the Bluetooth radio would be running native, while the connection and data flow is established and controlled by the code running inside the container. This approach, commonly practiced in Evothings Studio, creates a simple layer of abstraction virtually eliminating the need to learn the intricacies of each operating system, keeping the low-level programming down to a bare minimum, typically a selected number of function calls using scripts.

Starting a scan for Low Energy devices (e.g. using EasyBLE by Evothings), is made by a single function call:

evothings.easyble.startScan();

Finding a device and retrieving a list of its services is equally straightforward:

device.connect(function(device)
{
    evothings.easyble.showInfo('Status: Connected - reading BLE services...');
    evothings.easyble.readServices(device);
});

New versions of Bluetooth also come with an increased range of functions, allowing devices to act as both a beacon and a connected device concurrently, and to create mesh data networks, something we've seen as one of the main advantages of Zigbee network topologies. How can we assure that mobile applications behave consistently as technology evolves? Backwards compatibility is easier to obtain than maintaining a set of future-proof native plugins. Our philosophy at Evothings is to uphold basic functionality in native code (or Java in the case of Android), and have the intricate details instead expressed in JavaScript to maintain a dynamic behavior. A language like JavaScript allows later binding between an application's building blocks than languages based on Java or C/C++. This brings many advantages for development, testing and prototyping, as well as for deployment to end users.

Welcome to explore the Bluetooth Smart world with us. For more examples with connected hardware and other Bluetooth use cases, you can acquire the Evothings Studio software bundle at evothings.com, where you will also find additional reading on Bluetooth and hands-on tutorials for a number of popular hardware devices. Mobile clients for development are free, and published on the public app stores (search for “evothings viewer”).


Eddystone is coming, opening up the beacon space

Estimote, Kontakt and Radius Networks are upping their hardware offerings; here are some things you need to know.
Read more→


Evothings Studio is coming cloud-side

Evothings Studio 2.0 is a SaaS solution, bringing a wide array of possibilities for the IoT developer. Join the growing community and learn more about the benefits of using Evothings Studio for IoT app development.
Read more and download→


Ready for app stores with PhoneGap Build

A simple way to build and sign your app for publishing is via a build service. Adobe's PhoneGap Build is one of the candidates.
Read more→

Evothings Studio 2.0 Beta 2 released!


Stockholm Dec 10, 2015

We're very happy to announce that Evothings Studio 2.0 Beta 2 is available for general access. This is a production-grade release, and we recommend using it for customer work. In fact, development of IoT apps has never been faster, with a growing number of templates and examples for various IoT devices, systems and cloud services.

We'd like to thank everyone providing feedback on requirements for development, testing and production of apps suitable for the Internet of Things and connected devices and services. We're confident that Evothings Studio is ready for the world and in stable operation, with services running in the US, Asia and Europe!

Improved UI

This release comes with an overall improved UI for the Workbench, with clearer instructions and new functionality for a simpler workflow, adding more persistence to connected desktop and mobile clients.

Being able to make copies of projects in a single click makes it easier to take bundled examples and make them your own. Another feature allows you to create new projects using a basic template, alongside drag-and-dropping your own projects into the Workbench. The developer’s own apps are collected under “My Apps”, for easy access. For safety, alerts are introduced when removing a project (i.e. unlinking, we don’t touch any real files) from your list.

Connect from anywhere

Since Evothings Studio took the leap from being a stand-alone download for desktops, to a full-fledged hosted service, it’s become so much easier to connect mobile clients to the developer Workbench, regardless of network or method for internet access.

You can mix and match clients running on 3G/4G with WiFi as well as other tethering options. It also allows developers, testers, clients and others to access and run the same code, with multiple devices for easy distribution of project contents and comparison between devices.

Share your thoughts with us!

evoteam_se

We love feedback. We’ve created a menu option straight under the Workbench main menu, for developers to use. In addition to our regular web forum we have also started using Gitter for community live chat. Just use the link below to join our Gitter chat room right in your browser!

 

Download Evothings Studio and get started!
evothings.com/download

Release notes for Evothings Studio
evothings.com/doc/studio/release-notes.html

Questions? – Visit the Evothings Forum
evothings.com/forum

Our Gitter main room, where developers can hang out
gitter.im/evothings/evothings

Our IRC channel on Freenode
webchat.freenode.net/?channels=evothings

IoT business opportunities using Eddystone™ – cost-effective sensor technology and fast development of mobile applications with Evothings Studio


Visualizing sensor data is fun!
[Image credits]

If you are at a company that develops products taking advantage of IoT (Internet of Things) technologies, you should explore the possibilities of the Eddystone beacon protocol. Eddystone™ is a new beacon standard from Google that opens up a wide range of engaging and groundbreaking use cases and business ideas. In this article, we will explore use cases for beacon-based sensors that people interact with using a mobile application.

Beacons broadcast data – apps listen, users act

Eddystone is an open-sourced beacon protocol developed by Google. A beacon in this context is a small hardware radio device that broadcasts data over Bluetooth Low Energy (BLE). The typical range of the radio signal is 20 to 100 meters (60-300 feet).

The size of a beacon is usually around 3-5 centimeters (1.2-2 inches), so it’s easy to fit it in many applications and contexts. Most industrial-grade beacons in production today are “stand-alone” devices powered by a battery, that typically lasts for at least a year. Stationary beacons can be powered by the electricity grid. Alternative power sources are certainly possible, like solar power or kinetic energy.

A common use case for beacons is users who have mobile phones with an app that detects beacon signals. The app can alert when the user is in close proximity to a beacon, display which beacons are within range at a given time, and show information related to a nearby beacon.

Beacons and sensor data

A basic Eddystone beacon can broadcast an ID or a URL that a mobile app can use to display information related to the beacon. Standard Eddystone beacons also typically transmit temperature data. An app can detect the approximate distance to the beacon, and rudimentary triangulation using signals from multiple beacons is possible. In addition, a beacon can be equipped with various sensors and send out sensor data to nearby apps.
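As a concrete illustration of the distance estimation mentioned above, here is a small JavaScript sketch using the common log-distance path-loss model. The calibration values are assumptions, not part of any Eddystone spec: txPower is the RSSI measured at 1 meter (often broadcast by the beacon itself), and the path-loss exponent n varies per environment.

```javascript
// Estimate the distance (in meters) to a beacon from a received
// signal strength (RSSI) reading, using a log-distance path-loss
// model. txPower is the calibrated RSSI at 1 m and n is the
// path-loss exponent (~2 in free space, higher indoors); both
// defaults here are illustrative and should be calibrated per beacon.
function estimateDistance(rssi, txPower, n) {
  txPower = txPower === undefined ? -59 : txPower;
  n = n === undefined ? 2.0 : n;
  return Math.pow(10, (txPower - rssi) / (10 * n));
}

// A beacon heard at exactly its calibrated 1 m power is 1 m away.
console.log(estimateDistance(-59)); // → 1
// Weaker signals map to larger estimated distances.
console.log(estimateDistance(-79) > estimateDistance(-69)); // → true
```

Because RSSI fluctuates, real apps usually smooth readings over time before applying a model like this, which is why the text above calls triangulation "rudimentary".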


These floating sensors are, as far as we know, not beacon-equipped, but they very well could be.
[Image credits]

The Eddystone protocol is extensible and can be made to carry sensor data. Eddystone is an open standard, and it is possible to extend it with custom frame types. A frame is a packet of data broadcast by the beacon. Each frame may contain only a limited number of bytes, a restriction set by the underlying BLE standard. The solution is to broadcast different types of data in sequence, using different packets or frames.

Sensor data for temperature, humidity and light could for example fit into a single frame. There could be another frame type for accelerometer data, for applications that need that. And so on for any type of sensor data.
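To make this concrete, here is a sketch of the decoding an app could do for such a custom frame. The frame layout below (a type byte 0x30, then temperature, humidity and light as big-endian 16-bit values) is entirely hypothetical, invented for this example; a real design would follow whatever format your own firmware broadcasts.

```javascript
// Decode a hypothetical custom Eddystone frame carrying sensor data.
// Assumed layout (NOT part of the Eddystone specification):
//   byte 0:    frame type (0x30, chosen arbitrarily here)
//   bytes 1-2: temperature in 1/100 °C, signed, big-endian
//   bytes 3-4: relative humidity in 1/100 %, unsigned, big-endian
//   bytes 5-6: light level in lux, unsigned, big-endian
function decodeSensorFrame(bytes) {
  if (bytes[0] !== 0x30) { return null; } // not our frame type
  var temp = (bytes[1] << 8) | bytes[2];
  if (temp & 0x8000) { temp -= 0x10000; } // sign-extend negative values
  return {
    temperature: temp / 100,
    humidity: ((bytes[3] << 8) | bytes[4]) / 100,
    light: (bytes[5] << 8) | bytes[6]
  };
}

// 21.50 °C, 45.00 % RH, 300 lux packed into seven bytes:
var frame = new Uint8Array([0x30, 0x08, 0x66, 0x11, 0x94, 0x01, 0x2C]);
console.log(decodeSensorFrame(frame));
// → { temperature: 21.5, humidity: 45, light: 300 }
```

The same pattern (one type byte, then a fixed field layout) can be repeated for an accelerometer frame or any other sensor frame your hardware needs.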

Extensibility is a strength of the Eddystone standard. If you develop custom hardware, you can program the BLE module to send out the data your hardware supports.

In the coming 6-12 months, we’ll likely see standardisation begin around new frame types for Eddystone. Meanwhile, you can use custom frames designed for your hardware.

Creating a mobile app that picks up sensor data

An IoT-based product has two parts: hardware (sensors/beacons) and a mobile application. The mobile app acts as a remote viewer for the sensors. The app receives Eddystone frames and visualises sensor data.

In the code that handles Eddystone broadcasts, the app needs to decode the custom frames to get the beacon’s data into a form it can use. In the case of apps developed with Evothings Studio, you just add snippets of code as needed to the Eddystone JavaScript library.

Using Evothings Studio and JavaScript means that you can directly modify the code and see the results instantly on a mobile phone. There is no need to compile native code for each update cycle. This is also true for the low-level bit-manipulation code that decodes Eddystone frames. This means you can develop both the user interface of the app and the low-level BLE-related code in the same fast and efficient way, and see that it works on all your phones and tablets directly.

Broadcasting sensor data sidesteps the need to connect

Many of the existing BLE devices with built-in sensors require a mobile app to connect to them to access sensor data. Connecting to a generic BLE device has its drawbacks: only one mobile phone may connect at a time to a given BLE device. This limits usability in use cases where several users need to access sensor data simultaneously.

Broadcasting data, like beacons do, solves this problem. Listening to BLE broadcasts (announcements) is connectionless. It requires little coding in the app, and as many phones as are around can intercept broadcasted data packets.

It should be noted that in order to write data to a BLE device, you need to connect to it. In the case of monitoring sensor data, this is however rarely needed.

Innovating the use of broadcasted data has tremendous potential for a wide range of businesses. We have not yet even seen the start of this evolution. Prepare for exciting times in the development of mobile apps for IoT.

Detecting things that move

A beacon can either have a fixed location or move around when attached to a moving object, person or animal. The traditional beacon use case is that a beacon has a fixed location, and users move and get notified when near a beacon. For sensors that are stationary in nature, this is a likely scenario.

There may also be cases where beacons are attached to objects that move. It can be goods, vehicles, animals, or people. A user near a moving beacon can then be alerted when the object in question is within range.

One can also make applications where users’ mobile phones are used for locating people. With stationary beacons, the position is known. An app coming into range of a beacon can report this to a cloud service (over WiFi or 3G/4G). The cloud service can then notify other users and systems that this user is now present in the proximity of the beacon.

Example business cases

Here are some monetizable use cases based on beacons that broadcast sensor data:

Greenhouse farming. A mobile application is used to monitor data critical to plant growth, e.g. temperature, humidity, light. In this scenario, beacon-capable sensors are placed on-site next to plant cultures. Bluetooth Low Energy works where there is no network connectivity, which is useful for farming sites located off the grid or in regions with unreliable or absent infrastructure. Solar energy can be used to charge both sensor equipment and mobile phones.

Tracking and monitoring cattle. Sensors can monitor things like an animal’s body temperature and heart rate. Farmers can also detect missing animals and be alerted when a beacon goes out of range. Mobile apps that scan for BLE devices work “in the wild”, where there is no 3G/4G or WiFi available. It is a cheap technology that works independently of the availability of network services.

Monitoring production lines. It is cheap to use BLE-equipped sensors and mobile apps to monitor production lines, for instance growth of bacteria in production of medical drugs. IoT companies can provide customizable solutions based on standard components. Apps developed in HTML/JavaScript can easily be adapted for customers and different market segments at low cost.

Below is a screenshot from an example app created with Evothings Studio that monitors cultivation sensor data:

EcoPlant

Takeaways for Evothings Studio and Eddystone

Summary of takeaways:

  • Eddystone is an extensible standard – sensor data can be added to the protocol, allowing real-time monitoring of many sensor beacons without actually connecting to each one.
  • BLE-equipped sensor devices can act as beacons and transmit sensor data to mobile apps that are listening.
  • Broadcast data is a cost-effective and efficient way to communicate with users within range of the beacon. Data from different makes and models can be collected in a uniform way.
  • Any number of users/mobile phones can listen to Eddystone broadcasts – no need to connect to a device to read its sensors.
  • Users do not have to connect or configure anything to start receiving sensor data – they just open the app and nearby sensors are available right away.
  • Apps can also scan for sensor data while in the background – users can be alerted at critical events.
  • Bluetooth Low Energy and mobile apps work independently of electricity and networking infrastructure, for long periods of time without an electric grid.
  • It is easy and fast to implement a mobile app that listens to Eddystone broadcasts. You don’t need specific skills in Android Java or iOS programming.
  • Evothings provides extensible libraries that allow developers to create apps using HTML and JavaScript. Custom Eddystone frame types can be added to the Evothings Eddystone library – no native coding required.

Learn more about Eddystone

Here is more to read about Eddystone:

For IoT hackers: Prototyping Eddystone devices and developing mobile apps

To develop a mobile app, use the example app “Eddystone Scan” that comes with Evothings Studio 2.0 as a starting point. If you want to customise the Eddystone library, the code of interest is in the file eddystone.js.

If you don’t have any beacons to experiment with, you can use an Eddystone emulator on your computer, for example: github.com/don/node-eddystone-beacon. Since the code is open-source you can experiment with adding custom frame types to the code.

You can also use ARM mbed to develop Eddystone devices on for example the Nordic Semiconductor nRF51 BLE board.

If you are a BLE firmware developer (or use boards like the Nordic Semiconductor nRF51), check out the protocol specification for Eddystone frame formats, then get ready to get your hands dirty and experiment with your own frame types.

Download Evothings Studio – it just takes 5 minutes to get started

  1. Download Evothings Studio
  2. Install Evothings Viewer on your phone – available on Apple App Store and Google Play Store
  3. Run the “Eddystone Scan” example app that comes with the Evothings Studio download. It will detect any nearby Eddystone beacons right away.

It takes less than 5 minutes to get started!


Using the Particle Photon as a wireless game controller


Two years ago, Particle (formerly Spark) launched their Kickstarter campaign to fund the development of the Core. It proved to be a fine idea; they got funding, and a lot of things have changed in the IoT market since then. Now the Particle Photon is out, a sequel to the Core and a bigger brother in several ways:

photon

The major changes include a shiny 120 MHz microprocessor (STM32F205) and six times the RAM, now 128 KB. The board is also equipped with both a DAC and CAN, features fairly unusual for this kind of development board.

In short, the team behind Particle has developed an impressive and well-thought-out offering, where the development environment is mainly cloud-based (Particle Build), although more experienced users may opt for the “offline” Particle Dev software.

SparkFun were kind enough to provide us with a Photon and a couple of their shields. Among the shields I got my hands on were the Battery Shield and the IMU Shield. Ever since I began developing examples at Evothings, I have thought it would be interesting to add connectivity and external controls to an existing, normally stand-alone mobile application using Evothings Studio. Enter the pacman-canvas application developed by Platzh1rsch; this entire tutorial is based on work from that repository. The pacman-canvas application is a Pacman game implemented using nothing but HTML5, and it can be tested here. Since it is pure HTML5, it is possible to run the application on a mobile phone using Evothings Studio. In this tutorial we will show you how to modify the application to connect to a Photon equipped with an IMU shield, which will be used to control Pacman himself in the game.

In order to follow this tutorial you need a Particle Photon and a smartphone (iOS/Android). I will also assume that you are familiar with Particle Build. If you’re not, Particle provides an excellent introduction on their web page to bring you up to speed.

Source Code

You can browse the source code for this tutorial in the Evothings GitHub repository.

You will also find Platzh1rsch’s pacman-canvas in its original version in his GitHub repository.

What You Need

In order to follow this tutorial I assume that you are familiar with Particle Build. You need the following hardware:

  • A Particle Photon with the SparkFun Photon IMU Shield
  • An iOS or Android smartphone
  • A local network with a DHCP server

Step 1 – Hardware

The simplest step in this tutorial is hardware preparation: simply stack the shields together and you are ready to start coding. I prefer to put the stack in a breadboard to minimize the risk of unintentionally short-circuiting any of the exposed pins. You can see my stack in the picture below.

Particle Photon

Step 2 – Embedded Software

Preparation

As stated earlier, I assume that you are familiar with Particle Build, so we will dive directly into the source code.

Source Code

The complete source code described below can be found on the Github repository.

What we are going to do is build a small TCP server on the Photon. When a client connects, the device starts sending simple JSON objects containing acceleration data to the connected client. We will also implement functionality that prints the network information over a serial connection. In this tutorial I assume that the Photon and the smartphone are connected to the same local network, but with some networking skills you can make this example usable over the internet as well.

// This #include statement was automatically added by the Particle IDE.
#include "SparkFunLSM9DS1/SparkFunLSM9DS1.h"

// Configure SparkFun Photon IMU Shield
#define LSM9DS1_M 0x1E 
#define LSM9DS1_AG  0x6B

// Configure example
#define DELAY_BETWEEN_TRANSFERS 50 // In milliseconds
#define SERVER_PORT 23
#define RGB_BRIGHTNESS 128
#define RGB_R 0
#define RGB_G 255
#define RGB_B 0
 
TCPServer server = TCPServer(SERVER_PORT);
TCPClient client;
LSM9DS1 imu;

void setup()
{

	// Enable serial communication
	Serial.begin(9600);

	// Start listening for clients
	server.begin();
	
	// Initialize SparkFun Photon IMU Shield
	imu.settings.device.commInterface = IMU_MODE_I2C;
	imu.settings.device.mAddress = LSM9DS1_M;
	imu.settings.device.agAddress = LSM9DS1_AG;
	imu.begin();

}

In the code above we include the SparkFunLSM9DS1 library, which allows us to communicate with the SparkFun IMU Shield, and configure the shield to communicate over the I2C bus. Then there are some defines that can be used to tweak the application to fit your needs; in this tutorial we assume they are left untouched. The next step is to declare the server and client objects, which we use to communicate over the WiFi connection, and lastly we declare imu, our interface to the IMU shield.

In setup() we enable the serial communication that we will use to print the received networks settings. Then we start listening for TCP connections by executing the server.begin() method. The IMU shield is initiated after some settings have been updated. Now we have configured and initiated the serial connection, TCP server and the IMU shield.

The following part covers the loop() function that is executed continuously on the development board.

void loop()
{

	if (client.connected()) {

		// Discard data not read by client 
		client.flush();

		// Take control of the led
		RGB.control(true);
		RGB.color(RGB_R, RGB_G, RGB_B);

		// Read IMU data
		imu.readAccel(); 
		
		// Create JSON-object 
		char buffer [40]; 
		size_t length = sprintf(buffer, 
					"{\"ax\": %.3f, \"ay\": %.3f, \"az\": %.3f}\n",
					imu.calcAccel(imu.ax),
					imu.calcAccel(imu.ay),
					imu.calcAccel(imu.az)
						);   

		// Flash LED     
		RGB.brightness(RGB.brightness() == RGB_BRIGHTNESS ? 0 : RGB_BRIGHTNESS);

		// Transfer JSON-object
		server.write((uint8_t *)buffer, length);

	} else {

		// Turn on LED and release control of LED
		RGB.brightness(RGB_BRIGHTNESS);
		RGB.control(false);

		// Check if client connected
		client = server.available();
	}

	// Send network information if serial data is received
	if(Serial.available()) {

		Serial.println(WiFi.localIP());
		Serial.println(WiFi.subnetMask());
		Serial.println(WiFi.gatewayIP());
		Serial.println(WiFi.SSID());

		while(Serial.available()) {
			Serial.read();
		}

	}

	delay(DELAY_BETWEEN_TRANSFERS);
}

The loop consists mainly of an if-else statement that checks whether a client is connected to the development board. If a client is present, the application first discards any data not yet read by the client by calling client.flush(). Then the software takes control of the RGB LED and sets the color used to signal that a client is connected. The imu.readAccel() method updates the internal data structures with the latest acceleration data fetched from the shield; it has to be called before imu.calcAccel() in order to get the latest sampled acceleration. Next, we fill a buffer with a JSON object containing the latest acceleration data. Before the buffer is transmitted to the client with server.write(), the RGB LED on the development board is flashed.

If no client is connected to the Photon, the application makes sure the LED is turned on and releases control of it. Then it checks whether a new client has connected.

If the application receives any data over the serial connection, it prints the network information for the current connection: the local IP, subnet mask, gateway and the name of the network. Finally, it reads and discards all data received from the serial connection.

As a last step, compile and flash the software onto your development board, open a serial connection to it and send any data to trigger the network information printout. Note the IP address of the board; we will use it later in the mobile application.

Step 3 – Mobile application

Source Code

The complete source code described below can be found on the Github repository.

Fetch the pacman-canvas commit 3c24c7a995b1e329c5849ac24421eedfc1d70474 from GitHub. Unzip the application and open the files index.htm and pacman-canvas.js in an editor of your choice. These are the only two files that we have to edit in order to achieve our goals. Let’s begin with index.htm.

The first thing that we will do is to remove some code snippets that we don’t need. Remove the code from the following sections:

  • Google Analytics
  • Google Adsense
  • Highscore

The reason we remove the Highscore section is that it is implemented using PHP, a technology that is not supported when the application runs in a mobile context. Locate the <div> with the id menu-button and delete the following code:

<li class="button" id="highscore">Highscore</li>

The next step is to add the code snippet below just before the </head> tag.

<script>
	// Redirect console.log to Evothings Workbench.
	if (window.hyper && window.hyper.log) { 
		console.log = hyper.log
	}
</script>

<script src="cordova.js"></script>

This code redirects all console.log() calls to the Evothings Workbench, making it possible to debug applications remotely. The rest of the snippet includes the required Cordova-specific library.

The last step is to add two buttons to the main menu. One that we will use to connect to the Photon and one that will take us back to the start screen. Locate the <div> with the id menu-control and add the following code:

<li class="button" id="photon-control">Connect Photon</li>
<li class="button" id="evothings-back">Back</li>

There is nothing left to edit in index.htm at this point, so we move on to editing the pacman-canvas.js file. As a first step we are going to create an object that will represent the Photon. At the top of the file, right after the declaration of the global variables, add the following code snippet; don’t forget to fill in the IP address of your Photon:

// Photon handler
function Photon(){

  // IPAddress and port of the Photon
  var IPAddress = 'YOUR IP HERE';
  var port = 23;

  var socketId;
  var previousDirection = {};

  this.connect = function() {

    // Return immediately if there is a connection present
    if(socketId){
      return;
    }

    chrome.sockets.tcp.create(function(createInfo) {

      socketId = createInfo.socketId;

      chrome.sockets.tcp.connect(
        socketId,
        IPAddress,
        port,
        connectedCallback)
      });
    };

  this.disconnect = function () {

    chrome.sockets.tcp.close(socketId, function() {
      
      socketId = 0;

      // Reset graphics and callbacks associated 
      // with the button in the main menu

      $('#' + previousDirection.name).css('background-color', '');

      $('#photon-control').text('Connect Photon');

      $(document).off('click','.button#photon-control')
                 .on('click','.button#photon-control',
                 function(event) {
                  photon.connect()
                });
              })
  };

  function connectedCallback(result){

    if (result === 0) {

      // Update graphics and callbacks associated with the
      // button in the main menu 
      $('#photon-control').text('Disconnect');

      $(document).off('click','.button#photon-control')
                 .on('click','.button#photon-control',
                 function(event) {
                  photon.disconnect();
                 });

      // Set callback to handle received data
      chrome.sockets.tcp.onReceive.addListener(receiveData);

    }
    else {

      alert('Failed to connect to Photon! Try again!', function() {});
    }
  };

  // Function to handle received data. 
  function receiveData(info) {

    // Convert buffer to string containing the sent JSON-object
    var jsonString = String.fromCharCode.apply(null, new Uint8Array(info.data));

    // Try to convert the string to an actual JavaScript object 
    try {
      var jsonObject = JSON.parse(jsonString);
    }
    catch(e) {
      return; 
    }

    var ax = jsonObject['ax'];
    var ay = jsonObject['ay'];
    var az = jsonObject['az'];

    // Adjust pacman direction depending on received acceleration 
    if(Math.abs(ax) > Math.abs(ay)) {

      if(ax < 0) {
        adjustPacmanDirection(down);
      }
      else {
        adjustPacmanDirection(up);
      }

    }
    else if (Math.abs(ay) > Math.abs(ax)) {

      if(ay < 0) {

        adjustPacmanDirection(left);
      }
      else {

        adjustPacmanDirection(right);
      }
    } 
  };

  function adjustPacmanDirection(direction){

    pacman.directionWatcher.set(direction);

    $('#' + direction.name).css('background-color', '#008000');
      
    if(!direction.equals(previousDirection)) {
      $('#' + previousDirection.name).css('background-color', '');
      previousDirection = direction;
    }
  };
};

Note that you have to update the code snippet above with the IP address of your Photon.

The method connect() tries to connect to the development board using the defined IPAddress and port. It is basically a wrapper around chrome.sockets.tcp.create() that ensures the application is only connected to one development board at a time. The callback connectedCallback() is executed asynchronously when the connection attempt has a result, and evaluates whether the connection succeeded. If a connection is established (result === 0), the main menu is updated to handle a disconnect and a callback to handle received data (receiveData()) is registered. If the connection failed for some reason, an alert dialog is shown on the screen and the socketId is set to zero to enable new connection attempts.

The receiveData() function is executed each time new data is received. The first step is to convert the received data into a JavaScript object from which values can be extracted. The second step is to adjust Pacman’s direction depending on the acceleration values.
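One detail worth noting: TCP is a stream, so a single receive callback may carry a partial JSON object or several objects at once, and the try/catch in receiveData() simply drops whatever does not parse. Since the firmware terminates each object with a newline, a somewhat more robust alternative is to buffer incoming text and parse only complete lines. A sketch of that idea in plain JavaScript, independent of any socket API (the function names here are made up for illustration):

```javascript
// Buffer a newline-delimited JSON stream and invoke onObject for
// each complete object. A partial line stays in the buffer until
// the rest arrives; malformed lines are skipped.
function makeLineParser(onObject) {
  var buffer = '';
  return function (chunk) {
    buffer += chunk;
    var lines = buffer.split('\n');
    buffer = lines.pop(); // the last element is an incomplete line (or '')
    lines.forEach(function (line) {
      try { onObject(JSON.parse(line)); } catch (e) { /* skip bad line */ }
    });
  };
}

// A JSON object split across two TCP packets is still parsed correctly.
var results = [];
var feed = makeLineParser(function (obj) { results.push(obj); });
feed('{"ax": 0.1');
feed(', "ay": -0.2, "az": 0.9}\n');
console.log(results); // → [ { ax: 0.1, ay: -0.2, az: 0.9 } ]
```

For the 50 ms update rate used in this tutorial, dropping an occasional partial packet is harmless, which is why the simpler try/catch approach works fine in practice.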

The function adjustPacmanDirection() changes the direction of the Pacman figure in the game and updates the arrows to show which direction the figure is currently heading.

Declare a variable named photon among the global variables by adding the following code on the line below the declaration of the variable mapConfig.

var photon;

The next step is to define the variable photon. This is done by adding the following code just below the definition of the game variable (game = new Game()).

photon = new Photon(); 

The next and final step is to add initial callbacks to the buttons we added to index.htm earlier in this tutorial. This has to be done in the $(document).ready() method, since the DOM has to be fully loaded before we can add the callbacks. Locate the method and add the following code where the other buttons’ callbacks are defined.

$(document).on('click','.button#photon-control',function(event) {
  photon.connect()
});

$(document).on('click','.button#evothings-back',function(event) {
  history.back()
});

If you followed each and every step of the tutorial you should now be able to connect to your Photon and use it as a game controller for the game.

pacman_2

Summary

It was a pure pleasure to work with the Particle Photon. The team has truly succeeded in developing a great cloud-based offering that is simple to use. I will definitely use the development board as a base in future projects. Using Evothings Studio, I managed to develop the application in no time.

Time for you to start exploring the exciting world of IoT applications. Download Evothings Studio today and start developing your own IoT applications!

download_button_lesser

Fun holiday project! How to make a sensor enabled mobile app together with your kids


Holiday Garden Gnomes
[Image credits]

Holidays make for a great time being together, tinkering and crafting. What about creating a mobile app together with your children as a holiday or weekend project? In an evening you can be up and running with your very own app – created with HTML and JavaScript by you and your family!

Making mobile apps is easy using HTML and JavaScript

Evothings Studio is a development tool that makes it easy to create mobile apps for the Internet of Things (IoT) in JavaScript and HTML. In this tutorial you will learn how to get started and modify one of the example apps with a custom design. A fun activity to do with your kids!

There is a massive growth in IoT devices such as programmable beacons, sensors and microcontrollers. Evothings Studio is specifically designed with IoT apps in mind, making it easy to create mobile applications that connect to various IoT devices and sensors.

A mobile phone or tablet is in itself an IoT device, with many sensors built in. In this tutorial we will program the accelerometer, a sensor that tells us how the phone or tablet is tilted.

Evothings Studio connects your mobile phone to your computer, making it easy to program and run apps directly on the phone. You can even connect multiple phones at once!

Evothings Studio has two components. The application running on your computer is called Evothings Workbench, and the app running on the mobile phone is the Evothings Viewer.

What you need

  • Computer with Internet connection
  • iPhone/iPad or Android mobile phone or tablet with Internet connection
  • A curious mindset
  • One or more kids (actually, you don’t need any kids to enjoy this tutorial!)

Get Evothings Workbench installed on your computer

Click the download for your computer:

When the download is complete, you have a zip-file that contains Evothings Workbench. On most computers you can unpack it just by double-clicking the zip-file. Place the unzipped folder, for example, on the desktop or in your user home folder.

The unzipped folder contains the Evothings Workbench application. Double-click it to launch it. If you have a Mac, right-click it and select Open the first time you launch it.

To uninstall Evothings Workbench, just delete the unzipped folder.

If the computer asks for permission to let Evothings Workbench access the Internet, click Yes/OK/Allow depending on your operating system.

Congratulations, you are already halfway to having your first mobile app up and running!

Get the Evothings Viewer mobile app

Next step is to install Evothings Viewer on your mobile phone or tablet. Get it from Apple App Store or Google Play for Android.

Connect from Evothings Viewer to the Workbench

Now it is time to connect from the mobile app to Evothings Workbench and run your first app!

Launch the Workbench on your computer if you have not done so already.

Select the CONNECT tab and click the GET KEY button. This will give you a key code that you enter in the Viewer.

WorkbenchConnectButton

In the Viewer app, enter the key code you got in the Workbench and tap the CONNECT button.

ViewerKeyCode

If you see the connected screen you are ready to run your first app!

ViewerConnected

Running your first app

In the Workbench window, select the EXAMPLES tab, and click RUN on the “Hello World” example app.

WorkbenchExamplesRunHelloWorld

If everything goes well, the app should load on your mobile phone.

Making your own sensor-enabled app

Evothings Studio comes with several example apps, and one of them is “Cordova Accelerometer”. With this app, you can control a graphic sprite on the screen by tilting the mobile phone in different ways. (If you have not heard of “Cordova”, have no fear. It is a programming system for mobile apps, but you don’t have to know anything about Cordova to successfully complete this tutorial.)
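The idea behind the example can be sketched as a small pure function: the accelerometer reports x and y values (roughly -10 to 10 when the phone is tilted), and the app turns them into a position update for the sprite. The function below is a simplified illustration of that mapping, not the actual example code; the speed factor and screen bounds are made-up values.

```javascript
// Move a sprite based on accelerometer readings. Tilting right gives
// a positive x acceleration, which moves the sprite right; on screen
// the y axis points down, so a positive y acceleration moves it up.
// The new position is clamped to stay inside the screen bounds.
function moveSprite(pos, accel, bounds, speed) {
  speed = speed || 2; // illustrative scaling factor
  var x = pos.x + accel.x * speed;
  var y = pos.y - accel.y * speed;
  return {
    x: Math.min(Math.max(x, 0), bounds.width),
    y: Math.min(Math.max(y, 0), bounds.height)
  };
}

var pos = { x: 100, y: 100 };
// Phone tilted to the right and slightly toward the user:
pos = moveSprite(pos, { x: 3, y: -1 }, { width: 320, height: 480 });
console.log(pos); // → { x: 106, y: 102 }
```

Calling a function like this on every accelerometer reading, and redrawing the sprite at the returned position, is the essence of what the example app does.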

We will use the “Cordova Accelerometer” example as a starting point for our own app. What we will do is replace the screen contents with our own text and graphics.

Start out by clicking COPY on the “Cordova Accelerometer” example app.

WorkbenchExamplesCopyCordovaAccelerometer

Click the CREATE button in the dialog. This will give you a copy of the app under the MY APPS tab.

WorkbenchCopyAppDialogCreate

Now click RUN on the “Cordova Accelerometer” app under the MY APPS tab.

WorkbenchMyAppsRunMyApp

Tilt the phone to move the Evothings logo around the screen.

Editing the code

To find the source code for the app, click the CODE button in the Workbench.

WorkbenchMyAppsCodeMyApp

The file we will be editing is index.html.

Open index.html in a text editor, such as Notepad on Windows or TextEdit on OS X. If you have programmed before, you can use your favourite text editor.

Don’t use a word processor, because they format the text, and program code must contain plain text only.

Make sure when you save your edits that you save the file as plain text (if you feel clueless to what this means, just save and see if it works, it usually works well with Notepad and TextEdit).

As a first test we will alter the heading of the app. Change these lines:

<title>Cordova Accelerometer</title>
<h1>Cordova Accelerometer</h1>

To this:

<title>My First App</title>
<h1>My First App</h1>

The name in the title tag is shown under MY APPS, and the name in the h1 tag is shown on the mobile screen.

Save the file in the text editor (usually CTRL+S on Windows and CMD+S on OS X) and you should see the app reload on your mobile phone, displaying your new heading!

EditorTitleTag

EditorH1Tag

Cleaning up the screen

What we will do now is delete the text from the screen, leaving only the sprite and the “Vibration Setting” button.

Begin by deleting this line:

@import 'ui/css/evothings-app.css';

Then change this part of the code:

<header>
    <button class="back" onclick="history.back()">
        <img src="ui/images/arrow-left.svg" />
    </button>

    <img class="logotype" src="ui/images/logo.svg" 
alt="Evothings" />

    <!--<button class="menu" onclick=""><img src="ui/images/menu.svg" /></button>-->
</header>

<h1>My First App</h1>

<p>This app demonstrates the built-in accelerometer of the mobile device.</p>
<p>Tilt the mobile device you are holding to move the Evothings Logo.</p>
<p>When at the edge, the device vibrates if vibration is supported (iPad has no vibration for example).</p>

<img id="sprite" src="sprite.svg" />

<button class="blue" onclick="showVibrationDialog()">
    Vibration Setting
</button>

To this:

<h1>My First App</h1>

<img id="sprite" src="sprite.svg" />

<button onclick="showVibrationDialog()">
    Vibration Setting
</button>

Save and have a look at the result to check that your changes work.

Creating your own sprite

To create your own sprite, get an image in .png format to use in place of the Evothings logotype. You can draw your own image, or get an image from the Internet (just remember that if you want to distribute the app, you must have the rights to use the image).

Place your .png file in the same folder as index.html.

Assuming the image file is named myimage.png, edit index.html by changing this line:

<img id="sprite" src="sprite.svg" />

To this:

<img id="sprite" src="myimage.png" />

Now save and have a look at the result!

Do you want to change the size of the image? For example making it smaller?

Change this line:

width: 300px;

To this:

width: 150px;

EditorCSSWidth

Another fun thing is to change the background colour.

Add this inside the “<style>” section of the code:

body
{
    background: rgb(255,100,100);
}

EditorCSSBody

This will produce a light-red colour. Change the RGB (red, green, blue) colour values to get a different colour. The colour values should be numbers between 0 and 255. Try rgb(0,255,0). What colour do you get? Can you make the background blue?
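If you are curious, the clamping of colour values to the 0–255 range can be sketched in JavaScript. This little helper is just for illustration and is not part of the app:

```javascript
// Clamp each channel to 0..255 and build a CSS colour string.
function rgb(r, g, b) {
    function clamp(v) { return Math.max(0, Math.min(255, v)) }
    return 'rgb(' + clamp(r) + ',' + clamp(g) + ',' + clamp(b) + ')'
}

console.log(rgb(255, 100, 100))  // 'rgb(255,100,100)' - light red
console.log(rgb(0, 255, 0))      // 'rgb(0,255,0)' - green
console.log(rgb(0, 0, 300))      // 'rgb(0,0,255)' - blue (300 is clamped to 255)
```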

Some more programming

The sprite is controlled by a sensor called an “accelerometer”. This sensor is built into the phone and tells you the direction of gravity. When you hold the mobile phone, gravity points towards the center of the Earth, and the X and Y components of the gravity vector are what we use to move the sprite. When these are zero, the sprite does not move at all.

To change how sensitive the image is, edit this code:

function accelerometerHandler(accelerationX, accelerationY)
{
    var dx = accelerationX * -10
    var dy = accelerationY * -10
    moveSprite(dx, dy)
    vibrateOnEdgeCollision()
}

Change -10 to some other value, for example -5 to make the image move half as fast, or -20 to move twice as fast. You can also use different numbers for X and Y, like -5 and -20. And what happens if you remove the minus sign?
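If you want to see the effect of the factor before editing the app, here is a small sketch in plain JavaScript. The function name is made up for this illustration; it is not part of the example app:

```javascript
// How far the sprite moves for a given acceleration reading,
// depending on the factor you choose.
function spriteDelta(acceleration, factor) {
    return acceleration * factor
}

// With factor -10 (the default) a reading of 2 moves the sprite -20 pixels.
console.log(spriteDelta(2, -10))  // -20
// Halving the factor halves the movement.
console.log(spriteDelta(2, -5))   // -10
// Removing the minus sign reverses the direction.
console.log(spriteDelta(2, 10))   // 20
```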

You are the star!

Congratulations! You have completed the tutorial and made a mobile app!

Yes, this is a real mobile app that you would be able to publish on the App Store or Google Play (using the Cordova build tool). This takes some work, but if you are curious, read on in the Evothings documentation and tutorials (see links below).

Show your friends what you have done and help them to create their apps!

Learn more

Share this tutorial with your friends

Pass on the word! Share this tutorial with other parents, children, hackers and colleagues on your social media sites. Simply talking to someone also still works these days, I have been told! ;-)

Evothings + Phoenix = Neato


Göran Krampe started working with us here at Evothings in November, and he has written an interesting article about how to make, step by step, an Evothings app that communicates with a Phoenix backend using the publish/subscribe pattern with Phoenix channels. Phoenix is the new, extremely scalable web server framework for the fast-growing Elixir language.

This is a very interesting alternative to using an MQTT broker – which just happens to be the subject of an upcoming article Göran is working on :)

Evothings does MQTT with IBM Bluemix


This article will show how you can connect your Evothings app to the IBM Bluemix cloud via their MQTT service called IoT Foundation. But let’s start with some background information around protocols. If you are impatient you can skip down!

Protocols overview

In the IoT world there are a few messaging protocols competing for attention, mainly XMPP, CoAP, MQTT, DDS and AMQP. These five different protocols all have their own strengths and weaknesses.

  • AMQP is an advanced message queue protocol focused on work queues with transactions. It is typically used between servers and not as a data collection protocol for the actual IoT devices.
  • XMPP is most often used in end user applications for accessing systems; it is not designed for constrained networking.
  • DDS is a data-centric, broker-less advanced protocol used a lot in industry for distributed real-time mission-critical communication, especially for control. It is also not specifically designed for many small IoT devices in constrained networks, but rather for well-defined connected systems with several interconnected parts.
  • CoAP and MQTT both deal with actual communication with the IoT devices, typically for collecting data. This means they are both designed to be efficient in low power constrained networking environments.

CoAP is a new IETF standard document transfer protocol that was designed for use with very simple electronic devices, allowing them to interoperate with HTTP. The idea behind CoAP is basically to map HTTP concepts to an efficient binary representation, and run it over UDP to support constrained networks and devices. CoAP also supports multicast and encryption with DTLS (based on TLS). Using HTTP-CoAP gateways, IoT devices can be used with REST patterns through regular HTTP, making resources available under a URI and accessible through HTTP methods such as GET, PUT, POST, and DELETE.

Before going into MQTT (also see Wikipedia) – of the above five protocols, CoAP and MQTT are typically the two most interesting to support in Evothings mobile apps, and they are complementary since they differ quite a lot.

This article shows how to get going with MQTT today in Evothings apps using the IBM Bluemix IoT Foundation service.

Why MQTT is important

MQTT (Message Queue Telemetry Transport) was created 15 years ago and is fairly mature and very simple. It’s a binary publish/subscribe brokered protocol running over TCP using standard SSL for encryption.

mqttorg

There is also a variant of MQTT called MQTT-SN (MQTT For Sensor Networks) which is even more lightweight and can operate over UDP or other network standards like ZigBee. MQTT is an OASIS standard as of version 3.1.1 released in 2014. MQTT-SN is currently not an established standard, but you can read an interesting article here.

MQTT hits a “sweet spot” of being simple while still having a reasonable feature set and working well in constrained networking environments. In combination with MQTT-SN for really constrained devices it fits IoT scenarios really well.

There has however been critique published, and subsequently rebuttals have been made.

From my limited perspective the conclusion is that yes, MQTT is simple, but that is by design. And yes, it can still be improved, perhaps most importantly in error handling and QoS 1/2 functionality. But the fact remains – MQTT is the leading open IoT protocol at the moment.

Two client libraries

There are two good MQTT client implementations in JavaScript: the “Paho” library, which lives in the Eclipse Paho project, and MQTT.js, hosted and maintained on GitHub. Both support the latest MQTT version, 3.1.1, and thus also WebSockets.

In this article we are using the Paho library; we will probably try the other library in a follow-up article.

The Painter Example

Our example app is a trivial interactive painting app where the paint operations are shared over MQTT. Each user connects and subscribes to the same topic – and when a user is painting with the finger on the canvas we publish a simple JSON payload describing a painted line from (x,y) to (x2,y2) in a specific color, for each touchmove event.

When the app receives messages it paints them. Note that even the original painter will only paint the line when it’s received as a message. This means as the active painter you get a sense of the latency involved.
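A sketch of what such a payload might look like follows; the exact field names used by the example app may differ, so check app.js for the real format:

```javascript
// Build the JSON payload for one painted line segment.
// Field names here are illustrative, not necessarily those in app.js.
function makeLineMessage(x, y, x2, y2, color) {
    return JSON.stringify({ x: x, y: y, x2: x2, y2: y2, color: color })
}

// On arrival, the payload is parsed back and the line is painted.
function parseLineMessage(payloadString) {
    return JSON.parse(payloadString)
}

var payload = makeLineMessage(10, 20, 15, 25, '#ff0000')
var line = parseLineMessage(payload)
console.log(line.x2)     // 15
console.log(line.color)  // '#ff0000'
```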

The app uses the MQTT server at IBM Internet Of Things Foundation, or IBM IoTF for short. This is IBM’s backbone for their IoT cloud services and it is included in IBM’s Bluemix platform. Below I will describe how you can sign up and get a 30 day test period for free without entering any credit card.

Get a Bluemix Account

After 2016-02-08, when my Bluemix trial account expires :), you will no longer be able to run the app out-of-the-box against my account. So if you want to try this with your own Bluemix, or if it is after February, you need to do the following to get signed up with Bluemix and get an MQTT service up and running:

  1. Sign up here.
  2. Then confirm your email by clicking the link in the email you got, and if you end up on the Bluemix login page your email address is your login id.
  3. When logged in, click “CATALOG” in the top menu. Then check the “Internet of Things” checkbox under Services in the left side filter. Then click on Internet of Things Foundation.
  4. With the Free plan selected etc, just press “CREATE” on the far right side. This will create the IoTF service for you and when you get back to the dashboard you should see 1 service.
  5. Now, clicking on the IoTF service you find some “steps” to follow, click the “Launch dashboard” box. Now you have arrived at the dashboard for the MQTT service itself.
  6. At the top you can see “Organization ID: xxxx”. Take note of that id, we will enter it into the application below.
  7. Now we need to generate an API key. Click on the “Access” tab. Then click on “API Keys”. Finally click on “Generate API Key” shown at the bottom. Copy the information shown and save it somewhere.

Ok, now we have a user and password we can use in our mobile app to connect to the MQTT server.
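For reference, the broker host name and client id are typically assembled from these credentials. The host pattern and the `a:` application client id format follow IBM's documented IoTF conventions at the time of writing; verify them against your own dashboard:

```javascript
// Assumed values for illustration; use the ids from your own Bluemix account.
var orgId = '8q7k23'        // your organization id
var appId = 'painter-demo'  // any id unique within your organization

// IoTF broker host is derived from the organization id.
var host = orgId + '.messaging.internetofthings.ibmcloud.com'
// Application clients use the 'a:<orgId>:<appId>' client id format.
var clientId = 'a:' + orgId + ':' + appId

console.log(host)      // 8q7k23.messaging.internetofthings.ibmcloud.com
console.log(clientId)  // a:8q7k23:painter-demo
```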

Running the app

Now let’s get our app running. I presume you have already gotten started with Evothings Workbench; then you just need to:

  1. Git clone the example to your laptop, or just download and unzip it.
  2. Drag the index.html file and drop it onto the “My Apps” tab in the Workbench. It should then be added to the list.

Before starting the app, you need to edit it and enter proper credentials if you signed up for your own Bluemix account above. Open the app.js file and modify lines 10-13 with the correct settings:

var orgId = '8q7k23'; // This is your organization id in your Bluemix account
var userName = 'a-8q7k23-r3eqe3bunc'; // This is the API key from the IoTF service
var password = 'W5UlH3X7-)v8F-ngD7';  // This is the authentication token for the key

Now if your Workbench is connected with one or more devices you should be able to just click the RUN button to get the app started.

The app should hopefully say “Connected!” on your device. Feel free to use the “CONNECT” button to reconnect if you get disconnected; at the moment the example does not try to keep the connection alive via ping. You can also open the JavaScript tools using the top-right Tools button, which lets you see what the app is logging while it’s running.
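If you want the app to recover from dropped connections on its own, a simple reconnect helper with exponential backoff can be sketched like this. It is generic JavaScript; `connectFn` stands in for whatever connect call your app actually uses:

```javascript
// Retry a connect function with increasing delay until it succeeds.
// connectFn(onSuccess, onFailure) stands in for your real connect call.
function reconnectWithBackoff(connectFn, maxDelayMs) {
    var delay = 1000
    function attempt() {
        connectFn(
            function onSuccess() { /* connected, nothing more to do */ },
            function onFailure() {
                // Double the delay up to the cap, then try again.
                delay = Math.min(delay * 2, maxDelayMs)
                setTimeout(attempt, delay)
            })
    }
    attempt()
}
```

Calling `reconnectWithBackoff(myConnect, 30000)` keeps retrying at 2, 4, 8… seconds, capped at 30 seconds, instead of hammering the broker at a fixed rate.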

Now start painting with your finger, each connected device should paint its own color.

painter1

Conclusion

Using MQTT is easy and the publish/subscribe model often fits IoT scenarios nicely. When running with the Paho library against Bluemix I have noticed it sometimes fails to connect giving the error message AMQJS0008I Socket closed. I suspect that it may be a timing issue in the secure websocket handshake.

Hope you found this article interesting, download Evothings Studio today and start hacking your own MQTT powered app!

download_button_lesser

Evothings does MQTT with VerneMQ or EMQTT


This is a followup article to Evothings does MQTT with Bluemix. This time we make the same application run but using our own MQTT server, showing how to get going with both VerneMQ and EMQTT using a real certificate from Letsencrypt.org.

I also tried to use MQTT.js instead of the Paho library we used previously, but I eventually gave up; for some reason I just couldn’t get it to connect. The Paho library is also about 5x smaller even when minified, and the MQTT.js library is primarily written for NodeJS, so… at this point I conclude that the Paho library is preferred.

Read the previous article first if you want some background information on MQTT, including an IoT protocol overview. I also found this presentation highly informative and a good summary.

Letsencrypt

In order to set up our own server talking MQTT over secure WebSockets we need a proper SSL certificate, not a self-signed one. You can get one for free these days, for example from StartSSL, but the hip and cool way is to use the up-and-coming Letsencrypt service!

hip-and-cool

The official documentation probably works fine, but I ended up following this howto and it was really simple. I did this on my Debian 8 server:

First stop anything running on port 80 and 443. Then get letsencrypt:

git clone https://github.com/letsencrypt/letsencrypt
cd letsencrypt

Next we let it create our cert. It will ask us some questions, most importantly the host name(s) we want the cert to be valid for; you can enter more than one name for your domain, separated by spaces:

./letsencrypt-auto certonly

When all is done you should find your certificate files in /etc/letsencrypt/live/your.domain.com/. The certificate is good for 90 days, see instructions in the documentation for renewals.

Finally an easy and free way to get a proper cert!

Your own MQTT server

So we want to use our own MQTT server. There are plenty around and I ended up trying out both VerneMQ and EMQTT. They are very easy to get started with and both are written in Erlang which in my book means they are probably robust choices.

Let’s try EMQTT first. If you prefer VerneMQ, skip down to the next section, you only need one server :)

EMQTT

Just follow the instructions to download and unzip in your regular user home directory. Now we need to configure it for secure websockets using our new certificate. Bring up the file emqttd/etc/emqtt.config in your favorite editor and edit it as follows:

  1. In the Listeners-section I skipped regular http support by simply removing the whole existing “HTTP and WebSocket Listener” section.
  2. Then after finding out how to use separate cert and chain files I uncommented the “WebSocket over HTTPS Listener” section and changed the certificate settings to look like this:
    [
       {certfile, "etc/ssl/cert.pem"},
       {keyfile,  "etc/ssl/privkey.pem"},
       {cacertfile, "etc/ssl/chain.pem"}
    ]
    
  3. Save it!

The reason for this is that using only the fullchain/privkey files caused iOS to have connection issues although Android handled it just fine…

do not ask me

Finish it by copying the certificate files (not sure why it didn’t work for me at first with the original paths):

cd emqttd/etc/ssl
cp /etc/letsencrypt/live/your.domain.name/*.pem .

If all is well you can now start EMQTT by doing ./bin/emqttd start. It should reply with “emqttd is started successfully!”.

Then you can use ./bin/emqttd_ctl --help to find information on how to list clients, list topics, trace topics etc. Quite neat! To see that you have proper listeners running you can check with ./bin/emqttd_ctl listeners and of course you can also see the ports open by running sudo netstat -plnt or similar.

Note that out-of-the-box the configuration allows anonymous connections, which is just fine for our testing of course.

VerneMQ

Installing VerneMQ was also quite easy following the instructions. Then it’s time for configuration, but if that page looks daunting, this is what I ended up changing:

  1. Turn on anonymous access:
    allow_anonymous = on
    
  2. Add a listener for secure web sockets:
    listener.wss.default = 0.0.0.0:8083
    
  3. Add Letsencrypt certificate files:
    listener.wss.cafile = /etc/vernemq/fullchain.pem
    listener.wss.certfile = /etc/vernemq/cert.pem
    listener.wss.keyfile = /etc/vernemq/privkey.pem
    
  4. Finally it turns out we need to increase a limit to cater for the longer iOS device UUIDs:
    max_client_id_size = 40
    
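The reason for raising the client id limit is easy to check: the example app uses the device UUID as MQTT client id, and a UUID alone is already 36 characters, longer than the 23-character default that follows the MQTT 3.1 specification. A quick sketch (the prefix and UUID value are made up for illustration):

```javascript
// An iOS device UUID is 36 characters long (example value).
var deviceUuid = '123e4567-e89b-12d3-a456-426614174000'

// With even a short prefix the client id exceeds the 23-character
// default limit, hence max_client_id_size = 40 in the VerneMQ config.
var clientId = 'app-' + deviceUuid
console.log(clientId.length)  // 40
```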

Then we copy the cert files into /etc/vernemq/ and change owner to vernemq. Using the original paths may work if you fix permissions so that vernemq can read them, but I copied them to be sure:

cp /etc/letsencrypt/live/your.domain.name/*.pem /etc/vernemq/
chown vernemq:vernemq /etc/vernemq/*.pem

…and that should be it! Start it up using the proper service – it will not say anything if it succeeds, so check with status afterwards to make sure it’s up:

root@padme:~# service vernemq start
root@padme:~# service vernemq status
pong
vernemq is running

If you get a warning about ulimit you probably need to raise your limits, but it still works for a demo.

ulimits too low

If you want to check if it works, you can try the online demo from HiveMQ. Just enter your hostname, port and check the SSL checkbox, then click Connect. If it connects (you get a green circle instead of a red one) you can also test a bit of publish/subscribe to make sure it works :)

Client library

There are two MQTT client implementations in JavaScript: the “Paho” library, which lives in the Eclipse Paho project, and MQTT.js, hosted and maintained on GitHub. Both support the latest MQTT version, 3.1.1, and thus also WebSockets.

Previously we used the Paho library, but in this article I wanted to try out MQTT.js. After fiddling with browserify (or webpack) and rewriting the example code to fit the API etc, I utterly failed to connect using that library; kinda felt like this guy:

failure

Obviously I must have missed some detail, but I really have no clue why it doesn’t want to connect. So for us, the Paho library is what we use and recommend. Paho is also much smaller and not primarily written for NodeJS, although I haven’t scrutinized the code quality of these two libraries.

The Painter Example

Our example app is the same as before – a trivial interactive painting app where the paint operations are shared over MQTT. Each user connects and subscribes to the same topic – and when a user is painting with the finger on the canvas we publish a simple JSON payload describing a painted line from (x,y) to (x2,y2) in a specific color, for each touchmove event.

When the app receives messages it paints them. Note that even the original painter will only paint the line when it’s received as a message. This means as the active painter you get a sense of the latency involved.
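One way to make that latency visible is to embed a timestamp when publishing and compare on arrival. The field names below are illustrative and not part of the example app:

```javascript
// Attach the send time to an outgoing payload object...
function stampMessage(payload) {
    payload.sentAt = Date.now()
    return JSON.stringify(payload)
}

// ...and compute the roundtrip time when the message comes back.
// Note: this only measures true latency on the device that sent it.
function roundtripMs(payloadString) {
    var payload = JSON.parse(payloadString)
    return Date.now() - payload.sentAt
}

var wire = stampMessage({ x: 10, y: 20, x2: 15, y2: 25 })
console.log(roundtripMs(wire) >= 0)  // true
```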

Running the app

Now let’s get our app running. I presume you have already gotten started with Evothings Workbench; then you just need to:

  1. Git clone our demo repository to your laptop, or just download and unzip it.
  2. Locate the folder mqtt-vernemq-emqtt-painter and drag the index.html file from there and drop it onto the “My Apps” tab in the Workbench. It should then be added to the list.

Before starting the app, you need to edit it and enter your own host and port – you don’t need to change the user/password since we do not use them. Open the app.js file and modify lines 10-11 with the correct settings:

var host = 'padme.krampe.se'; // Change this to your server
var port = 8083; // No need to change this if you configured your server as described

Now if your Workbench is connected with one or more devices you should be able to just click the RUN button to get the app started.

The app should hopefully say “Connected!” on your device. Feel free to use the “CONNECT” button to reconnect if you get disconnected; at the moment the example does not try to keep the connection alive via ping. You can also open the JavaScript tools using the top-right Tools button, which lets you see what the app is logging while it’s running.

Now start painting with your finger, each connected device should paint its own color.

painter1

Conclusion

MQTT is very easy to get going with, and both VerneMQ and EMQTT are easy to install. Letsencrypt also makes it trivial to get your own real certificate, which is highly recommended since mobile hybrid apps refuse to deal with self-signed certificates. For a whole range of mobile IoT applications, MQTT should be a very logical choice for communication, both between devices (mobile or IoT) and between devices and servers.

Hope you found this article interesting, download Evothings Studio today and start hacking your own MQTT powered app!

download_button_lesser
