<?xml version="1.0" encoding="UTF-8"?>
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns="http://purl.org/rss/1.0/" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel rdf:about="https://community.wolfram.com">
    <title>Community RSS Feed</title>
    <link>https://community.wolfram.com</link>
    <description>RSS Feed for Wolfram Community showing any discussions tagged with Connected Devices sorted by most likes.</description>
    <items>
      <rdf:Seq>
        <rdf:li rdf:resource="https://community.wolfram.com/groups/-/m/t/418132" />
        <rdf:li rdf:resource="https://community.wolfram.com/groups/-/m/t/246929" />
        <rdf:li rdf:resource="https://community.wolfram.com/groups/-/m/t/344278" />
        <rdf:li rdf:resource="https://community.wolfram.com/groups/-/m/t/789911" />
        <rdf:li rdf:resource="https://community.wolfram.com/groups/-/m/t/315748" />
        <rdf:li rdf:resource="https://community.wolfram.com/groups/-/m/t/456947" />
        <rdf:li rdf:resource="https://community.wolfram.com/groups/-/m/t/196759" />
        <rdf:li rdf:resource="https://community.wolfram.com/groups/-/m/t/250923" />
        <rdf:li rdf:resource="https://community.wolfram.com/groups/-/m/t/23261" />
        <rdf:li rdf:resource="https://community.wolfram.com/groups/-/m/t/1026495" />
        <rdf:li rdf:resource="https://community.wolfram.com/groups/-/m/t/615905" />
        <rdf:li rdf:resource="https://community.wolfram.com/groups/-/m/t/1057588" />
        <rdf:li rdf:resource="https://community.wolfram.com/groups/-/m/t/454226" />
        <rdf:li rdf:resource="https://community.wolfram.com/groups/-/m/t/992466" />
        <rdf:li rdf:resource="https://community.wolfram.com/groups/-/m/t/822762" />
        <rdf:li rdf:resource="https://community.wolfram.com/groups/-/m/t/344241" />
        <rdf:li rdf:resource="https://community.wolfram.com/groups/-/m/t/170725" />
        <rdf:li rdf:resource="https://community.wolfram.com/groups/-/m/t/453169" />
        <rdf:li rdf:resource="https://community.wolfram.com/groups/-/m/t/505425" />
        <rdf:li rdf:resource="https://community.wolfram.com/groups/-/m/t/1153218" />
      </rdf:Seq>
    </items>
  </channel>
  <item rdf:about="https://community.wolfram.com/groups/-/m/t/418132">
    <title>Free Wolfram Language on Raspberry Pi tutorial</title>
    <link>https://community.wolfram.com/groups/-/m/t/418132</link>
    <description>*NOTE: the main tutorial notebook is attached at the end of this post and [can be downloaded by clicking here][1].*&#xD;
&#xD;
&#xD;
----------&#xD;
&#xD;
&#xD;
I wanted to share the attached *Mathematica* notebook that I created for teaching kids (ages 9-14) about the Wolfram Language on the Raspberry Pi. It has a simplified (and colorful) interface for students and easy editing tools for teachers to create new content (even those with little or no experience using *Mathematica*). I am extremely grateful for the efforts of Anna Musser who very patiently helped me refine the interface over many iterations and piloted the first workshops using this notebook at Empow Studios!&#xD;
&#xD;
It includes a self-paced tutorial designed for beginning programmers who are young or young-at-heart. It also includes instructions for authoring your own tutorials. The interface is minimally dynamic so the tutorial runs as smoothly as possible on the Raspberry Pi Model B; if there is interest, we could build a prettier dynamic interface for more powerful hardware. Please comment below with any improvements/changes that you would like to see, and of course please comment or upvote if you find this useful or interesting :)&#xD;
&#xD;
&#xD;
----------&#xD;
## Sample of the attached tutorial:&#xD;
&#xD;
&#xD;
&#xD;
![enter image description here][2]&#xD;
&#xD;
**COMPLETE TUTORIAL NOTEBOOK ATTACHED BELOW**&#xD;
&#xD;
&#xD;
  [1]: https://www.dropbox.com/s/mddbex45h7ynao3/FirstCourseOnRPI.nb?dl=1&#xD;
  [2]: http://community.wolfram.com//c/portal/getImageAttachment?filename=ScreenShot2017-12-05at11.46.57AM.png&amp;amp;userId=11733</description>
    <dc:creator>Kyle Keane</dc:creator>
    <dc:date>2015-01-07T18:16:20Z</dc:date>
  </item>
  <item rdf:about="https://community.wolfram.com/groups/-/m/t/246929">
    <title>One pixel thermal imaging camera with Mathematica and Arduino</title>
    <link>https://community.wolfram.com/groups/-/m/t/246929</link>
    <description>Triggered by a leak in my hot water boiler at home I built a thermal imaging camera using an Arduino and interfacing it with Mathematica. I tried to make up for the &amp;#034;one-pixel-resolution&amp;#034; by using Mathematica&amp;#039;s powerful image analysis abilities. This is a work in progress and I would be delighted to get some comments/suggestions from the Community. In this project, I had a lot of help from [url=http://community.wolfram.com/web/bschelter/home]Bjoern Schelter[/url], who has recently joined this Community. If you have the components and use the programs below, you should have a &amp;#034;working&amp;#034; one-pixel thermal camera after 30 minutes or so of DIY. Here&amp;#039;s a sneak peek of what we want to get out (this one is a &amp;#034;selfie&amp;#034;):&#xD;
&#xD;
[img=width: 426px; height: 270px;]/c/portal/getImageAttachment?filename=asdwefasdcsdvvafe2345QT.PNG&amp;amp;userId=11733[/img]&#xD;
&#xD;
I use the following components:[list=1]&#xD;
[*]Arduino Uno R3&#xD;
[*][url=http://www.amazon.co.uk/XINTE-MLX90614ESF-DCI-non-contact-Infrared-Temperature/dp/B00IMU0LXG/ref=sr_1_2?ie=UTF8&amp;amp;qid=1399153918&amp;amp;sr=8-2&amp;amp;keywords=Melexis]MELEXIS / MLX90614ESF-DCI / DS Digital non-contact Infrared Temperature Sensor[/url]  (~ £35 and more or less the same in USD)&#xD;
[*][url=http://www.amazon.co.uk/MG995-Servo-Sensor-Mount-Black/dp/B00EZIYCUW/ref=sr_1_3?ie=UTF8&amp;amp;qid=1399153991&amp;amp;sr=8-3&amp;amp;keywords=Pan+tilt]MG995 Servo Sensor Mount Kit 2 DOF Pan and Tilt Black[/url] (~ £24, similar in USD)&#xD;
[*]Two 4.7 kOhm resistors.&#xD;
[*]One 0.1 uF capacitor.&#xD;
[*]One small breadboard.&#xD;
[*]5V power source.&#xD;
[*]Wires. &#xD;
[/list]The idea is illustrated in this [url=http://www.youtube.com/watch?v=rcTKVOzxCmw]Youtube video[/url]. To my best knowledge, the original idea comes from a [url=http://www.theimagingsource.com/en_US/blog/posts/20090622/]project of Steffen Strobel in the German science competition &amp;#034;Jugend Forscht&amp;#034;[/url]. The main idea is to mount a non-contact temperature sensor on a pan and tilt mechanism (i.e. two servos) on a tripod. An Arduino microcontroller is then used to communicate via the serial port with Mathematica, which is used to control the servos and triggers the measurements. After the data acquisition Mathematica cleans the data and produces some thermal images (see below).&#xD;
&#xD;
We use the following wiring diagram to connect the servos and the temperature sensor to the Arduino.&#xD;
[center][img=width: 300px; height: 203px;]/c/portal/getImageAttachment?filename=ThermoCam.jpg&amp;amp;userId=48754[/img][/center]&#xD;
The resistors are 4.7kOhm and the capacitor is 0.1uF. The sensor part is taken from the [url=http://bildr.org/2011/02/mlx90614-arduino/]bildr.blog[/url], which also shows how to make Arduino talk to the sensor. The Melexis sensor that we chose has a temperature resolution of 0.02 degrees Celsius and a rather narrow field of view, which is important for our application. &#xD;
&#xD;
For the servo part, we use the standard servo.h library; an example of its application can be found [url=http://arduino.cc/en/Tutorial/sweep]here[/url].&#xD;
&#xD;
Here is a photo of the sensor/head of the device.&#xD;
[center][img=width: 320px; height: 240px;]/c/portal/getImageAttachment?filename=photo.JPG&amp;amp;userId=48754[/img][/center]&#xD;
The entire device looks like this.&#xD;
[center][img=width: 240px; height: 320px;]/c/portal/getImageAttachment?filename=7033photo4.JPG&amp;amp;userId=48754[/img][/center]&#xD;
The idea is to use Mathematica to send instructions to the servos and to initiate the measurements. To interface Mathematica with Arduino we use the [url=http://library.wolfram.com/infocenter/Demos/5726/]SerialIO package[/url]. I found [url=http://williamjturkel.net/2011/12/25/connecting-arduino-to-mathematica-on-mac-os-x-with-serialio/]this website by William Turkel[/url] very useful to make SerialIO work on my Mac; following the steps and adapting some directories makes the package work without any problems.&#xD;
&#xD;
At that point, we have everything in place, and only need to put the bits together. We first need to upload this piece of code (also attached at the bottom) to the Arduino.&#xD;
[code]#include &amp;lt;i2cmaster.h&amp;gt;&#xD;
#include &amp;lt;Servo.h&amp;gt; &#xD;
&#xD;
//Servo setup&#xD;
int servoPin1 = 9;&#xD;
int servoPin2 = 10; &#xD;
Servo servo1;  &#xD;
Servo servo2;&#xD;
int angle1 = 40;   // servo start positions in degrees &#xD;
int angle2 = 50;&#xD;
&#xD;
&#xD;
//Melexis setup&#xD;
int sensor = 0;&#xD;
int inByte = 0;&#xD;
&#xD;
&#xD;
void setup()&#xD;
{&#xD;
  Serial.begin(9600);&#xD;
&#xD;
  // attach the pan-tilt servos&#xD;
  servo1.attach(servoPin1);&#xD;
  servo2.attach(servoPin2);&#xD;
&#xD;
  servo1.write(angle1);&#xD;
  servo2.write(angle2);&#xD;
&#xD;
  // initialise the i2c bus&#xD;
  i2c_init();&#xD;
  PORTC = (1 &amp;lt;&amp;lt; PORTC4) | (1 &amp;lt;&amp;lt; PORTC5); // enable pullups&#xD;
  establishContact();&#xD;
}&#xD;
&#xD;
&#xD;
void loop()&#xD;
{&#xD;
 if (Serial.available() &amp;gt; 0) &#xD;
  {&#xD;
   inByte = Serial.read();&#xD;
   &#xD;
   int dev = 0x5A&amp;lt;&amp;lt;1;&#xD;
   int data_low = 0;&#xD;
   int data_high = 0;&#xD;
   int pec = 0;&#xD;
&#xD;
&#xD;
   i2c_start_wait(dev+I2C_WRITE);&#xD;
   i2c_write(0x07);&#xD;
&#xD;
&#xD;
   // read&#xD;
   i2c_rep_start(dev+I2C_READ);&#xD;
   data_low = i2c_readAck(); //Read 1 byte and then send ack&#xD;
   data_high = i2c_readAck(); //Read 1 byte and then send ack&#xD;
   pec = i2c_readNak();&#xD;
   i2c_stop();&#xD;
&#xD;
&#xD;
   // Combine the high and low bytes into a temperature reading; the MSB is an error bit and is ignored&#xD;
   double tempFactor = 0.02; // 0.02 degrees per LSB (measurement resolution of the MLX90614)&#xD;
   double tempData = 0x0000; // zero out the data&#xD;
   int frac; // data past the decimal point&#xD;
&#xD;
&#xD;
 // Serial.print(tempData);&#xD;
 // Serial.write(inByte);&#xD;
   // This masks off the error bit of the high byte, then moves it left 8 bits and adds the low byte.&#xD;
   tempData = (double)(((data_high &amp;amp; 0x007F) &amp;lt;&amp;lt; 8) + data_low);&#xD;
   tempData = (tempData * tempFactor)-0.01;&#xD;
&#xD;
&#xD;
   //inByte = (float)(((data_high &amp;amp; 0x007F) &amp;lt;&amp;lt; 8) + data_low);&#xD;
&#xD;
&#xD;
  float celsius = tempData - 273.15;&#xD;
   sensor=(int)(celsius*100);&#xD;
   //float fahrenheit = (celsius*1.8) + 32;&#xD;
&#xD;
  Serial.print(sensor);&#xD;
  &#xD;
   &#xD;
   // horizontal &amp;#034;H&amp;#034;-&amp;gt; 72; reverse &amp;#034;R&amp;#034;-&amp;gt; 82; vertical &amp;#034;V&amp;#034;-&amp;gt; 86; end &amp;#034;E&amp;#034;-&amp;gt; 69&#xD;
   &#xD;
  if(inByte==72)&#xD;
  {&#xD;
   angle1=angle1+1;&#xD;
   servo1.write(angle1);&#xD;
  }&#xD;
   if(inByte==82)&#xD;
  {&#xD;
   angle1=40;&#xD;
   servo1.write(angle1);&#xD;
  }&#xD;
  if(inByte==86)&#xD;
  {&#xD;
   angle2=angle2+1;&#xD;
   servo2.write(angle2);&#xD;
  }&#xD;
    if(inByte==69)&#xD;
  {&#xD;
   angle1 = 40;   // servo back to start&#xD;
   angle2 = 50;&#xD;
   servo1.write(angle1);&#xD;
   servo2.write(angle2);&#xD;
  }&#xD;
  &#xD;
   delay(15); // 15 works; wait 15 milliseconds before printing again&#xD;
 }&#xD;
&#xD;
&#xD;
}&#xD;
&#xD;
&#xD;
&#xD;
void establishContact() &#xD;
{&#xD;
 while (Serial.available() &amp;lt;= 0) &#xD;
 {&#xD;
   Serial.print(&amp;#039;A&amp;#039;);&#xD;
   delay(100);&#xD;
 }&#xD;
}&#xD;
[/code]&#xD;
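The temperature conversion in the middle of loop() can be checked on its own. Here is a small Python sketch of the same arithmetic (purely illustrative; the raw reading 0x394E is a made-up example value):

```python
def mlx90614_to_celsius(data_high, data_low):
    # Drop the error bit (bit 7 of the high byte), then combine high and
    # low bytes: (data_high mod 0x80) * 0x100 is the same as masking the
    # high byte with 0x7F and shifting it left by eight bits.
    raw = (data_high % 0x80) * 0x100 + data_low
    kelvin = raw * 0.02 - 0.01   # 0.02 K per least significant bit
    return kelvin - 273.15

# A made-up raw reading of 0x394E comes out at roughly room temperature:
print(round(mlx90614_to_celsius(0x39, 0x4E), 2))
```

The Arduino sketch then multiplies the Celsius value by 100 and truncates to an integer before sending it over the serial line.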
The idea is to make Mathematica communicate with the Arduino via the serial connection. The Arduino sketch shows that the Arduino waits for instructions, e.g. &amp;#034;H&amp;#034; to move horizontally, &amp;#034;V&amp;#034; to move vertically and &amp;#034;E&amp;#034; to go to the end position.&#xD;
[mcode](*First we load the SerialIO package. See instructions above.*)&#xD;
&#xD;
&amp;lt;&amp;lt; SerialIO`&#xD;
&#xD;
(*We test whether Mathematica&amp;#039;s applications folder is in the Path. On some Macs Mathematica will be in the /Library directory - used in this example- and in others in the /Users/username/Library directory, where &amp;#034;username&amp;#034; needs to be replaced by the correct user name.*)&#xD;
&#xD;
MemberQ[$Path, &amp;#034;/Library/Mathematica/Applications&amp;#034;]&#xD;
&#xD;
(*If this gives True all is fine. If it evaluates to False execute&#xD;
AppendTo[$Path, &amp;#034;/Library/Mathematica/Applications&amp;#034;]&#xD;
*)&#xD;
&#xD;
(*Connect to the Arduino*)&#xD;
&#xD;
myArduino = &#xD;
  SerialOpen[Quiet[FileNames[&amp;#034;tty.usb*&amp;#034;, {&amp;#034;/dev&amp;#034;}, Infinity]][[1]]];&#xD;
SerialSetOptions[myArduino, &amp;#034;BaudRate&amp;#034; -&amp;gt; 9600];&#xD;
While[SerialReadyQ[myArduino] == False, Pause[0.1]];&#xD;
&#xD;
(*Data collection, in this case 40 vertical and 70 horizontal pixels; runtime 2-3 minutes; pauses cannot be reduced much further.*)&#xD;
&#xD;
pixels = {}; SerialRead[myArduino]; For[j = 1, j &amp;lt; 41, j++, &#xD;
 For[i = 1, i &amp;lt; 71, i++, SerialWrite[myArduino, &amp;#034;H&amp;#034;]; &#xD;
  AppendTo[pixels, (SerialRead[myArduino] // ToExpression)/100.]; &#xD;
  Pause[0.1]]; SerialWrite[myArduino, &amp;#034;R&amp;#034;]; &#xD;
 SerialWrite[myArduino, &amp;#034;V&amp;#034;]; SerialRead[myArduino]; &#xD;
 Pause[0.1];]; SerialWrite[myArduino, &amp;#034;E&amp;#034;];&#xD;
&#xD;
(*After the data acquisition, close the connection to the Arduino*)&#xD;
SerialClose[myArduino]&#xD;
&#xD;
(*Now we can represent the data in several different ways. Note that some points at the beginning/end of the scanned lines are removed; there were too many measurement errors just after the &amp;#034;carriage return&amp;#034;*)&#xD;
&#xD;
ArrayPlot[Partition[Reverse[pixels], 70][[All, 2 ;; -10]], &#xD;
 ColorFunction -&amp;gt; &amp;#034;Rainbow&amp;#034;]&#xD;
&#xD;
(*Here&amp;#039;s another colour scheme.*)&#xD;
ArrayPlot[Partition[Reverse[pixels], 70][[All, 2 ;; -10]], &#xD;
 ColorFunction -&amp;gt; &amp;#034;Temperature&amp;#034;]&#xD;
&#xD;
(*Occasionally there are some outliers in the measurements; here we clean them out.*)&#xD;
ArrayPlot[&#xD;
 Partition[Reverse[pixels /. x_ /; x &amp;gt; 35. -&amp;gt; 35.], 70][[All, &#xD;
   2 ;; -10]], ColorFunction -&amp;gt; &amp;#034;Temperature&amp;#034;]&#xD;
&#xD;
(*This last one uses interpolation to make the image smoother.*)&#xD;
&#xD;
ListContourPlot[&#xD;
 Partition[Reverse[Log /@ pixels /. x_ /; x &amp;gt; 35. -&amp;gt; 35.], &#xD;
   70][[-1 ;; 1 ;; -1, 1 ;; -10]], AspectRatio -&amp;gt; 0.9, &#xD;
 ColorFunction -&amp;gt; &amp;#034;Rainbow&amp;#034;, PlotRange -&amp;gt; All, &#xD;
 InterpolationOrder -&amp;gt; 2, Contours -&amp;gt; 60, ContourStyle -&amp;gt; None][/mcode][center][/center]So here&amp;#039;s a photo of my broken boiler and its scan:&#xD;
[center][img=width: 518px; height: 257px;]/c/portal/getImageAttachment?filename=BoilerScan.jpg&amp;amp;userId=48754[/img][/center]&#xD;
Because of the scanning procedure (which just looks at the angle and does not use any projection), the scan is slightly distorted, but it is possible to recognize the main features and even the sticker on the front!&#xD;
&#xD;
It appears that this rather primitive device can also be used to analyse electrical components. Here is an image of my MacBook Pro. [center][img=width: 450px; height: 306px;]/c/portal/getImageAttachment?filename=Laptop1.jpg&amp;amp;userId=48754[/img][/center]&#xD;
 The position of the CPU becomes quite obvious.&#xD;
&#xD;
There are many things that need to be improved: &#xD;
&#xD;
(i) First, there is the projection issue: the scanner sweeps through angles, so the data needs to be projected onto a 2D plane. One might use an ultrasonic distance sensor to get better results.&#xD;
(ii) The device needs to be calibrated.&#xD;
(iii) A user interface is needed. It would be useful to click on the image and get the temperature reading.&#xD;
(iv) The communication between Mathematica and the Arduino needs to be improved. The starting position of 40/50 degrees is hard-coded into the Arduino sketch; it should be set by the Mathematica code.&#xD;
(v) We have not even started to use Mathematica&amp;#039;s features on this. Much image processing could be done. The thermal image should be overlaid on a normal photo of the scanned object. Manipulate could be used to change thresholds, e.g. the cut-off for outliers, which is currently set to 35 degrees.&#xD;
(vi) The speed might be improved. I suppose that a scanning time of 3 minutes or so is typical for these devices, but one might improve that a bit. Also, Mathematica could use edge detection to determine regions where a higher scan density would help to get a better resolution. This only makes sense if the servos can be directed to a given position much more precisely; alternatively, we could use random positions, which are then determined precisely using an accelerometer.&#xD;
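
The projection issue mentioned in point (i) can be made concrete with a small sketch (Python, purely illustrative; the 2 m wall distance is an assumed example):

```python
import math

# The scanner records one temperature per (pan, tilt) angle, but a flat
# wall at distance d maps pan angle theta to the planar offset
# d * tan(theta). Equal 1-degree servo steps therefore cover larger and
# larger patches of the wall toward the edges of the scan.
def planar_offset(theta_deg, distance_m):
    return distance_m * math.tan(math.radians(theta_deg))

# Assumed example: wall 2 m away; compare a 1-degree step at the centre
# of the scan with one 35 degrees off-centre.
step_centre = planar_offset(1, 2.0) - planar_offset(0, 2.0)
step_edge = planar_offset(35, 2.0) - planar_offset(34, 2.0)
print(round(step_edge / step_centre, 2))
```

So a pixel near the edge of this scan covers roughly one and a half times the wall area of a central pixel, which is why the boiler image looks stretched toward its borders.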
&#xD;
There is of course much more to do. In spite of this being work in progress, I wanted to share this project, and hope for helpful comments.&#xD;
&#xD;
I attach the Mathematica notebook. I have a movie of the scanning process and the actual arduino sketch which I cannot upload directly. Here are links to the [url=https://www.dropbox.com/s/sm2prb5w0wrqexp/ThermoCamera_Forum.zip]arduino sketch[/url] and the [url=https://www.dropbox.com/s/wpgpww8a7jhjfn0/Scan.MOV]scanning movie[/url].&#xD;
&#xD;
M.</description>
    <dc:creator>Marco Thiel</dc:creator>
    <dc:date>2014-05-04T00:48:33Z</dc:date>
  </item>
  <item rdf:about="https://community.wolfram.com/groups/-/m/t/344278">
    <title>Using your smart phone as the ultimate sensor array for Mathematica</title>
    <link>https://community.wolfram.com/groups/-/m/t/344278</link>
    <description>Many fantastic posts in this community describe how to connect external devices to Mathematica and how to read the data. Connecting Mathematica to an Arduino for example allows you to read and then work with data from all kinds of sensors. In most of the cases, when we speak about connected devices, additional hardware is necessary. Smart phones, on the other hand, are our permanent companions and they host a wide array of sensors that we can tap into with Mathematica. For this post, I will be using an iPhone 5 - but a similar approach can be taken with many other smart phones. [Björn Schelter][1] and myself have worked on this together.&#xD;
&#xD;
The first thing we need in order to read the iPhone&amp;#039;s sensors is a little app, which can be purchased on the iTunes App Store: it is called [Sensor Data][2]. When you open the app, you see a screen like this one.&#xD;
&#xD;
![enter image description here][3]&#xD;
&#xD;
At the top of the screen you see an IP address and a port number (after the colon!). These numbers will be important for connecting to the phone and either downloading data or streaming sensor data directly. If you click on &amp;#034;start capture&amp;#034;, the iPhone&amp;#039;s data will be stored on the phone and can be downloaded into Mathematica. In this post, however, we are interested in the &amp;#034;Streaming&amp;#034; function. If you click on the respective button at the bottom, you get to a screen like this:&#xD;
&#xD;
![enter image description here][4]&#xD;
&#xD;
There you can choose a frequency for the measurements and start the streaming. In fact, with the Config button we can also choose which sensors we want to use.&#xD;
&#xD;
![enter image description here][5]&#xD;
&#xD;
The following Mathematica code will work when all (!) sensors are switched on. Now we are ready to connect to the iPhone. Switch the streaming on and execute the following commands:&#xD;
&#xD;
    ClearAll[&amp;#034;Global`*&amp;#034;];&#xD;
    For[i = 1, i &amp;lt; 3, i++, Quiet[InstallJava[]]];&#xD;
    Needs[&amp;#034;JLink`&amp;#034;]&#xD;
&#xD;
and then &#xD;
&#xD;
    LoadJavaClass[&amp;#034;java.util.Arrays&amp;#034;];&#xD;
    packet = JavaNew[&amp;#034;java.net.DatagramPacket&amp;#034;, JavaNew[&amp;#034;[B&amp;#034;, 1024], 1024];&#xD;
    socket = JavaNew[&amp;#034;java.net.DatagramSocket&amp;#034;, 10552];&#xD;
    socket@setSoTimeout[10];&#xD;
    listen[] := If[$Failed =!= Quiet[socket@receive[packet], Java::excptn], &#xD;
    record =JavaNew[&amp;#034;java.lang.String&amp;#034;, java`util`Arrays`copyOfRange @@ &#xD;
    packet /@ {getData[], getOffset[], getLength[]}]@toString[] //&#xD;
    Sow];&#xD;
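
As a rough illustration of what the JLink listener above does, here is an equivalent sketch in Python. The packet contents are made up; the app streams to a fixed port (10552 in this post), but here we let the OS pick a free port and send ourselves a packet so the sketch is self-contained:

```python
import socket

# Receive one UDP packet and split it into comma-separated numeric fields,
# mirroring the DatagramSocket/StringSplit combination used above.
receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
receiver.bind(("127.0.0.1", 0))   # the real app would use port 10552
receiver.settimeout(2.0)
port = receiver.getsockname()[1]

sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sender.sendto(b"370.25,0.012,-0.981,0.043", ("127.0.0.1", port))

data, _ = receiver.recvfrom(1024)
fields = [float(v) for v in data.decode().split(",")]
print(fields)
receiver.close()
sender.close()
```

The Mathematica version does the same thing, but wrapped in a listen[] function so it can be polled repeatedly by a scheduled task.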
&#xD;
Next we have to define a ScheduledTask to read the sensors:&#xD;
&#xD;
    RemoveScheduledTask[ScheduledTasks[]];&#xD;
    results = {}; &#xD;
    RunScheduledTask[AppendTo[results, Quiet[Reap[listen[]][[2, 1]]]]; If[Length[results] &amp;gt; 1200, Drop[results, 150]], 0.01];&#xD;
&#xD;
We also need to define a streaming function:&#xD;
&#xD;
    stream := Refresh[ToExpression[StringSplit[#[[1]], &amp;#034;,&amp;#034;]] &amp;amp; /@ Select[results[[-1000 ;;]], Head[#] == List &amp;amp;], UpdateInterval -&amp;gt; 0.01]&#xD;
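
The results[[-1000 ;;]] window in the stream function keeps the plots bounded in size. In Python, this sliding-window idea could be sketched with a bounded deque (illustrative only):

```python
from collections import deque

# A bounded deque silently discards the oldest readings, so a plot of its
# contents stays a fixed size no matter how long we stream.
window = deque(maxlen=1000)
for reading in range(1500):   # stand-in for incoming sensor packets
    window.append(reading)
print(len(window), window[0])
```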
&#xD;
Alright. Now comes the interesting part. Using &#xD;
&#xD;
    (*Compass*)&#xD;
    While[Length[results] &amp;lt; 1000, Pause[2]]; Dynamic[AngularGauge[Refresh[stream[[-1, 30]], UpdateInterval -&amp;gt; 0.01], {360, 0}, &#xD;
    ScaleDivisions -&amp;gt; None, GaugeLabels -&amp;gt; {Placed[&amp;#034;N&amp;#034;, Top], Placed[&amp;#034;S&amp;#034;, Bottom], Placed[&amp;#034;E&amp;#034;, Right], Placed[&amp;#034;W&amp;#034;, Left]}, ScaleOrigin -&amp;gt; {{5 Pi/2, Pi/2}, 1}, ScalePadding -&amp;gt; All, ImageSize -&amp;gt; Medium], SynchronousUpdating -&amp;gt; False]&#xD;
&#xD;
we can measure the bearing of our iPhone. The resulting compass moves as we move the iPhone:&#xD;
&#xD;
![enter image description here][6]&#xD;
&#xD;
We can also read the (x-,y-,z-) accelerometers&#xD;
&#xD;
    (*Plot accelerometers*)&#xD;
    While[Length[results] &amp;lt; 1000, Pause[2]]; Dynamic[Refresh[ListLinePlot[{stream[[All, 2]], stream[[All, 3]], stream[[All, 4]]}, PlotRange -&amp;gt; All], UpdateInterval -&amp;gt; 0.1]]&#xD;
&#xD;
which gives plots like this one:&#xD;
&#xD;
![enter image description here][7]&#xD;
&#xD;
The update is a bit bumpy, because the data is only sent from the iPhone every second or so; the measurements, however, are taken at a frequency of up to 100 Hz. We can also represent the FFT of the streamed data like so:&#xD;
&#xD;
    (*Plot FFT of accelerometers*)&#xD;
    While[Length[results] &amp;lt; 1000, &#xD;
     Pause[2]]; Dynamic[&#xD;
     Refresh[ListLinePlot[&#xD;
       Log /@ {Abs[Fourier[Standardize[stream[[All, 2]]]]], &#xD;
         Abs[Fourier[Standardize[stream[[All, 3]]]]], &#xD;
         Abs[Fourier[Standardize[stream[[All, 4]]]]]}, &#xD;
       PlotRange -&amp;gt; {{0, 200}, {-5, 2.5}}, ImageSize -&amp;gt; Large], &#xD;
      UpdateInterval -&amp;gt; 0.1]]&#xD;
&#xD;
Adding a &amp;#034;real time&amp;#034; scale is also quite straightforward:&#xD;
&#xD;
(*Measurements with time scale*)&#xD;
&#xD;
    While[Length[results] &amp;lt; 1000, Pause[2]];&#xD;
    starttime = IntegerPart[stream[[2, 1]]];&#xD;
    Dynamic[Refresh[&#xD;
      ListLinePlot[&#xD;
       Transpose[{(stream[[Max[-300, -Length[stream]] ;;, 1]] - &#xD;
           starttime), stream[[Max[-300, -Length[stream]] ;;, 2]]}], &#xD;
       PlotRange -&amp;gt; All, ImageSize -&amp;gt; Large], UpdateInterval -&amp;gt; 0.01]]&#xD;
&#xD;
Well, then. We can also plot our iPhone&amp;#039;s position in space&#xD;
&#xD;
    (*3d Motion*)&#xD;
    &#xD;
    While[Length[results] &amp;lt; 1000, Pause[2]]; Dynamic[&#xD;
     Refresh[ListLinePlot[{stream[[All, 5]], stream[[All, 6]], &#xD;
        stream[[All, 7]]}, PlotRange -&amp;gt; All], UpdateInterval -&amp;gt; 0.1]]&#xD;
    &#xD;
    While[Length[results] &amp;lt; 1000, Pause[2]]; Dynamic[&#xD;
     Graphics3D[{Black, &#xD;
       Rotate[Rotate[&#xD;
         Rotate[Cuboid[{-2, -1, -0.2}, {2, 1, 0.2}], &#xD;
          stream[[-1, 7]], {0, 0, 1}], -1*stream[[-1, 6]], {0, 1, 0}], &#xD;
        stream[[-1, 5]], {1, 0, 0}]}, &#xD;
      PlotRange -&amp;gt; {{-3, 3}, {-3, 3}, {-3, 3}}, Boxed -&amp;gt; True], &#xD;
     UpdateInterval -&amp;gt; 0.1, SynchronousUpdating -&amp;gt; False]&#xD;
&#xD;
This looks like so:&#xD;
&#xD;
![enter image description here][8]&#xD;
&#xD;
Last but not least, we can write a little GUI to access all the different sensors. (This does run a bit slowly, though!)&#xD;
&#xD;
(*GUI all sensors*)&#xD;
&#xD;
    sensororder = {&amp;#034;Timestamp&amp;#034;, &amp;#034;Accel_X&amp;#034;, &amp;#034;Accel_Y&amp;#034;, &amp;#034;Accel_Z&amp;#034;, &amp;#034;Roll&amp;#034;, &#xD;
       &amp;#034;Pitch&amp;#034;, &amp;#034;Yaw&amp;#034;, &amp;#034;Quat.X&amp;#034;, &amp;#034;Quat.Y&amp;#034;, &amp;#034;Quat.Z&amp;#034;, &amp;#034;Quat.W&amp;#034;, &amp;#034;RM11&amp;#034;, &#xD;
       &amp;#034;RM12&amp;#034;, &amp;#034;RM13&amp;#034;, &amp;#034;RM21&amp;#034;, &amp;#034;RM22&amp;#034;, &amp;#034;RM23&amp;#034;, &amp;#034;RM31&amp;#034;, &amp;#034;RM32&amp;#034;, &amp;#034;RM33&amp;#034;, &#xD;
       &amp;#034;GravAcc_X&amp;#034;, &amp;#034;GravAcc_Y&amp;#034;, &amp;#034;GravAcc_Z&amp;#034;, &amp;#034;UserAcc_X&amp;#034;, &amp;#034;UserAcc_Y&amp;#034;, &#xD;
       &amp;#034;UserAcc_Z&amp;#034;, &amp;#034;RotRate_X&amp;#034;, &amp;#034;RotRate_Y&amp;#034;, &amp;#034;RotRate_Z&amp;#034;, &amp;#034;MagHeading&amp;#034;, &#xD;
       &amp;#034;TrueHeading&amp;#034;, &amp;#034;HeadingAccuracy&amp;#034;, &amp;#034;MagX&amp;#034;, &amp;#034;MagY&amp;#034;, &amp;#034;MagZ&amp;#034;, &amp;#034;Lat&amp;#034;, &#xD;
       &amp;#034;Long&amp;#034;, &amp;#034;LocAccuracy&amp;#034;, &amp;#034;Course&amp;#034;, &amp;#034;Speed&amp;#034;, &amp;#034;Altitude&amp;#034;, &#xD;
       &amp;#034;Proximity&amp;#034;};&#xD;
    While[Length[results] &amp;lt; 1000, Pause[2]]; Manipulate[&#xD;
     Dynamic[Refresh[&#xD;
       ListLinePlot[{stream[[All, Position[sensororder, a][[1, 1]]]], &#xD;
         stream[[All, Position[sensororder, b][[1, 1]]]], &#xD;
         stream[[All, Position[sensororder, c][[1, 1]]]]}, &#xD;
        PlotRange -&amp;gt; All, ImageSize -&amp;gt; Full], &#xD;
       UpdateInterval -&amp;gt; 0.01]], {{a, &amp;#034;Accel_X&amp;#034;}, &#xD;
      sensororder}, {{b, &amp;#034;Accel_Y&amp;#034;}, sensororder}, {{c, &amp;#034;Accel_Z&amp;#034;}, &#xD;
      sensororder}, ControlPlacement -&amp;gt; Left, &#xD;
     SynchronousUpdating -&amp;gt; False]&#xD;
&#xD;
This gives a user interface which looks like this:&#xD;
&#xD;
![enter image description here][9]&#xD;
&#xD;
In the drop-down menus we can choose three out of all the sensors. These are all the available sensors:&#xD;
&#xD;
&amp;gt; &amp;#034;Timestamp&amp;#034;, &amp;#034;Accel_X&amp;#034;, &amp;#034;Accel_Y&amp;#034;, &amp;#034;Accel_Z&amp;#034;, &amp;#034;Roll&amp;#034;, &amp;#034;Pitch&amp;#034;, &amp;#034;Yaw&amp;#034;,&#xD;
&amp;gt; &amp;#034;Quat.X&amp;#034;, &amp;#034;Quat.Y&amp;#034;, &amp;#034;Quat.Z&amp;#034;, &amp;#034;Quat.W&amp;#034;, &amp;#034;RM11&amp;#034;,  &amp;#034;RM12&amp;#034;, &amp;#034;RM13&amp;#034;,&#xD;
&amp;gt; &amp;#034;RM21&amp;#034;, &amp;#034;RM22&amp;#034;, &amp;#034;RM23&amp;#034;, &amp;#034;RM31&amp;#034;, &amp;#034;RM32&amp;#034;, &amp;#034;RM33&amp;#034;, &amp;#034;GravAcc_X&amp;#034;,&#xD;
&amp;gt; &amp;#034;GravAcc_Y&amp;#034;, &amp;#034;GravAcc_Z&amp;#034;, &amp;#034;UserAcc_X&amp;#034;, &amp;#034;UserAcc_Y&amp;#034;,   &amp;#034;UserAcc_Z&amp;#034;,&#xD;
&amp;gt; &amp;#034;RotRate_X&amp;#034;, &amp;#034;RotRate_Y&amp;#034;, &amp;#034;RotRate_Z&amp;#034;, &amp;#034;MagHeading&amp;#034;, &amp;#034;TrueHeading&amp;#034;,&#xD;
&amp;gt; &amp;#034;HeadingAccuracy&amp;#034;, &amp;#034;MagX&amp;#034;, &amp;#034;MagY&amp;#034;, &amp;#034;MagZ&amp;#034;, &amp;#034;Lat&amp;#034;,   &amp;#034;Long&amp;#034;,&#xD;
&amp;gt; &amp;#034;LocAccuracy&amp;#034;, &amp;#034;Course&amp;#034;, &amp;#034;Speed&amp;#034;, &amp;#034;Altitude&amp;#034;, &amp;#034;Proximity&amp;#034;&#xD;
&#xD;
There are certainly many things that can and should be improved. The main problem seems to be that the data, even if sampled at 100 Hz, is sent from the iPhone only every second or so, so it is not really real time. I hope that someone who is better at iPhone programming than I am (I am really rubbish at it) could help and write an iPhone program that streams the data in a more convenient way: one reading at a time rather than in packets.&#xD;
&#xD;
There are many potential applications for this. Here are some I could come up with:&#xD;
&#xD;
 1. You can carry the iPhone around and measure your movements (acceleration). Attached to your hand you can measure your tremor. &#xD;
 2. The magnetometer is really cool. You can use it to find metal bars and also electric cables in the walls.&#xD;
 3. You can collect GPS data for all sorts of applications; there are ideas to use this for the detection of certain diseases. For example, if it takes you longer than usual to find your car when you come back from shopping, that might hint at early stages of dementia, or sleep deprivation.&#xD;
 4. When you put the phone on a machine, like a running motor, you can measure the vibrations. When you perform a frequency analysis you can check whether the motor runs alright.&#xD;
 5. Using the accelerometers I was able to measure my breathing (putting the phone on my chest).&#xD;
 &#xD;
I think that there might also be quite some potential for using the Wolfram Cloud here. Deploying a program in the cloud and reading from your phone is certainly quite interesting. The problem is that this particular app only works via WiFi. It would be nice to have one that works via 3G. &#xD;
&#xD;
So, in summary, it might be quite useful to use the iPhone&amp;#039;s sensors. The advantage is that nearly everyone carries a smartphone with them all the time. Making more of your smart phone&amp;#039;s sensors with Mathematica seems to be a nice playground for applications. I&amp;#039;d love to hear about your ideas...&#xD;
&#xD;
Cheers,&#xD;
&#xD;
Marco&#xD;
&#xD;
PS: When you are done with the streaming you should execute these commands:&#xD;
&#xD;
    (*Remove Scheduled Tasks and close link*)&#xD;
    RemoveScheduledTask[ScheduledTasks[]]; socket@close[];&#xD;
&#xD;
&#xD;
  [1]: http://community.wolfram.com/web/bschelter&#xD;
  [2]: https://itunes.apple.com/gb/app/sensor-data/id397619802?mt=8&#xD;
  [3]: /c/portal/getImageAttachment?filename=sensorwelcome.PNG&amp;amp;userId=48754&#xD;
  [4]: /c/portal/getImageAttachment?filename=sensorstreaming.PNG&amp;amp;userId=48754&#xD;
  [5]: /c/portal/getImageAttachment?filename=Allsensors.PNG&amp;amp;userId=48754&#xD;
  [6]: /c/portal/getImageAttachment?filename=Compass.gif&amp;amp;userId=48754&#xD;
  [7]: /c/portal/getImageAttachment?filename=Accelerometer.gif&amp;amp;userId=48754&#xD;
  [8]: /c/portal/getImageAttachment?filename=Iphonemovement.gif&amp;amp;userId=48754&#xD;
  [9]: /c/portal/getImageAttachment?filename=ScreenShot2014-09-15at23.52.46.png&amp;amp;userId=48754</description>
    <dc:creator>Marco Thiel</dc:creator>
    <dc:date>2014-09-15T23:53:14Z</dc:date>
  </item>
  <item rdf:about="https://community.wolfram.com/groups/-/m/t/789911">
    <title>Using Mathematica to see the world in a different light - part I</title>
    <link>https://community.wolfram.com/groups/-/m/t/789911</link>
    <description>The [international year of light][1] has just drawn to an end. From 4-6 February the [closing ceremony][2] took place in [Merida, Yukatan][3]. &#xD;
&#xD;
    GeoGraphics[GeoMarker[Entity[&amp;#034;City&amp;#034;, {&amp;#034;Merida&amp;#034;, &amp;#034;Yucatan&amp;#034;, &amp;#034;Mexico&amp;#034;}]], GeoRange -&amp;gt; Quantity[3000, &amp;#034;Kilometers&amp;#034;], GeoBackground -&amp;gt; &amp;#034;ReliefMap&amp;#034;]&#xD;
&#xD;
![enter image description here][4]&#xD;
&#xD;
The international Year of Light was a global initiative of the United Nations to celebrate light and light based technologies. Mathematica&amp;#039;s built-in Wikipedia data contains detailed information on the Year of Light; here is the first sentence of the article.  &#xD;
&#xD;
    TextSentences[WikipediaData[&amp;#034;Year of Light&amp;#034;]][[1]]&#xD;
&#xD;
&amp;gt; The International Year of Light and Light-based Technologies, 2015 &#xD;
&amp;gt; (IYL 2015) is a United Nations observance that aims to raise &#xD;
&amp;gt; awareness of the achievements of light science and its applications, &#xD;
&amp;gt; and its importance to humankind.&#xD;
&#xD;
I am planning to write three posts to show how the Wolfram Language, its wealth of data, and connected devices can be used to keep the year of light alive at your home. In this first part, I will use a spectrometer, connect it to the Wolfram Language, and try to &amp;#034;see the world in a different light&amp;#034;. It turns out that the Wolfram Language will be as important as the hardware for this project. &#xD;
&#xD;
Light is key to life on earth, and sight is a key sense; most of the information our brains process comes from our vision. When the first organisms developed primitive vision, it was an enormous evolutionary advantage, allowing them to escape predators and localise prey. Light plays a crucial role in our modern lives - most likely the information on this very website was delivered to you using optical fibres and light. Light also allows us to study everything from the smallest particles up to the farthest reaches of the universe.  &#xD;
&#xD;
In the 17th century Sir Isaac Newton introduced the word &amp;#034;spectrum&amp;#034; into optics, referring to the range of colours observed when light passes through a prism. Today spectrometers are used in many scientific labs to study everything from molecules to the light of stars. In this Community several posts have described the construction of [spectrometers][5] and [Raspberry Pi spectrometers][6]. For this post I will use a commercial, small spectrometer, the [C12666MA Micro-Spectrometer][7], that attaches to an [Arduino Uno][8]. Hooking it up to the Arduino is trivial, as the pins nicely align. &#xD;
&#xD;
![enter image description here][9]&#xD;
&#xD;
I will first upload the following code to the Arduino:&#xD;
&#xD;
    // This code is modified from the original sketch by Peter Jansen&#xD;
    // https://github.com/tricorderproject/arducordermini&#xD;
    // This version removes the external ADC and uses the internal ADC instead.&#xD;
    // It also simply prints the output as CSV to the serial terminal.&#xD;
    &#xD;
    #define SPEC_GAIN        A0&#xD;
    //#define SPEC_EOS         NA&#xD;
    #define SPEC_ST          A1&#xD;
    #define SPEC_CLK         A2&#xD;
    #define SPEC_VIDEO       A3&#xD;
    #define WHITE_LED        A4&#xD;
    #define LASER_404        A5&#xD;
    &#xD;
    #define SPEC_CHANNELS    256&#xD;
    uint16_t data[SPEC_CHANNELS];&#xD;
    &#xD;
    void setup() {&#xD;
    &#xD;
      //pinMode(SPEC_EOS, INPUT);&#xD;
      pinMode(SPEC_GAIN, OUTPUT);&#xD;
      pinMode(SPEC_ST, OUTPUT);&#xD;
      pinMode(SPEC_CLK, OUTPUT);&#xD;
    &#xD;
      pinMode(WHITE_LED, OUTPUT);&#xD;
      pinMode(LASER_404, OUTPUT);&#xD;
      digitalWrite(WHITE_LED, LOW);&#xD;
      digitalWrite(LASER_404, LOW);&#xD;
      &#xD;
      &#xD;
      //digitalWrite(WHITE_LED, HIGH);&#xD;
      //digitalWrite(LASER_404, HIGH);&#xD;
    &#xD;
      digitalWrite(SPEC_GAIN, HIGH);&#xD;
      digitalWrite(SPEC_ST, HIGH);&#xD;
      digitalWrite(SPEC_CLK, HIGH);&#xD;
      digitalWrite(SPEC_GAIN, HIGH); //LOW Gain&#xD;
      //digitalWrite(SPEC_GAIN, LOW); //High Gain&#xD;
    &#xD;
      //Serial.begin(9600);&#xD;
      Serial.begin(115200);&#xD;
    }&#xD;
    &#xD;
    void readSpectrometer()&#xD;
    {&#xD;
      //int delay_time = 35;     // delay per half clock (in microseconds).  This ultimately controls the integration time.&#xD;
      int delay_time = 1;     // delay per half clock (in microseconds).  This ultimately controls the integration time.&#xD;
      int idx = 0;&#xD;
      int read_time = 35;      // Amount of time that the analogRead() procedure takes (in microseconds) (different micros will have different times) &#xD;
      int intTime = 5; &#xD;
      int accumulateMode = false;&#xD;
      int i;&#xD;
    &#xD;
      // Step 1: start leading clock pulses&#xD;
      for (int i = 0; i &amp;lt; SPEC_CHANNELS; i++) {&#xD;
        digitalWrite(SPEC_CLK, LOW);&#xD;
        delayMicroseconds(delay_time);&#xD;
        digitalWrite(SPEC_CLK, HIGH);&#xD;
        delayMicroseconds(delay_time);&#xD;
      }&#xD;
    &#xD;
      // Step 2: Send start pulse to signal start of integration/light collection&#xD;
      digitalWrite(SPEC_CLK, LOW);&#xD;
      delayMicroseconds(delay_time);&#xD;
      digitalWrite(SPEC_CLK, HIGH);&#xD;
      digitalWrite(SPEC_ST, LOW);&#xD;
      delayMicroseconds(delay_time);&#xD;
      digitalWrite(SPEC_CLK, LOW);&#xD;
      delayMicroseconds(delay_time);&#xD;
      digitalWrite(SPEC_CLK, HIGH);&#xD;
      digitalWrite(SPEC_ST, HIGH);&#xD;
      delayMicroseconds(delay_time);&#xD;
    &#xD;
      // Step 3: Integration time -- sample for a period of time determined by the intTime parameter&#xD;
      int blockTime = delay_time * 8;&#xD;
      long int numIntegrationBlocks = ((long)intTime * (long)1000) / (long)blockTime;&#xD;
      for (int i = 0; i &amp;lt; numIntegrationBlocks; i++) {&#xD;
        // Four clocks per pixel&#xD;
        // First block of 2 clocks -- measurement&#xD;
        digitalWrite(SPEC_CLK, LOW);&#xD;
        delayMicroseconds(delay_time);&#xD;
        digitalWrite(SPEC_CLK, HIGH);&#xD;
        delayMicroseconds(delay_time);&#xD;
        digitalWrite(SPEC_CLK, LOW);&#xD;
        delayMicroseconds(delay_time);&#xD;
        digitalWrite(SPEC_CLK, HIGH);&#xD;
        delayMicroseconds(delay_time);&#xD;
    &#xD;
        digitalWrite(SPEC_CLK, LOW);&#xD;
        delayMicroseconds(delay_time);&#xD;
        digitalWrite(SPEC_CLK, HIGH);&#xD;
        delayMicroseconds(delay_time);&#xD;
        digitalWrite(SPEC_CLK, LOW);&#xD;
        delayMicroseconds(delay_time);&#xD;
        digitalWrite(SPEC_CLK, HIGH);&#xD;
        delayMicroseconds(delay_time);&#xD;
      }&#xD;
    &#xD;
    &#xD;
      // Step 4: Send start pulse to signal end of integration/light collection&#xD;
      digitalWrite(SPEC_CLK, LOW);&#xD;
      delayMicroseconds(delay_time);&#xD;
      digitalWrite(SPEC_CLK, HIGH);&#xD;
      digitalWrite(SPEC_ST, LOW);&#xD;
      delayMicroseconds(delay_time);&#xD;
      digitalWrite(SPEC_CLK, LOW);&#xD;
      delayMicroseconds(delay_time);&#xD;
      digitalWrite(SPEC_CLK, HIGH);&#xD;
      digitalWrite(SPEC_ST, HIGH);&#xD;
      delayMicroseconds(delay_time);&#xD;
    &#xD;
      // Step 5: Read Data 2 (this is the actual read, since the spectrometer has now sampled data)&#xD;
      idx = 0;&#xD;
      for (int i = 0; i &amp;lt; SPEC_CHANNELS; i++) {&#xD;
        // Four clocks per pixel&#xD;
        // First block of 2 clocks -- measurement&#xD;
        digitalWrite(SPEC_CLK, LOW);&#xD;
        delayMicroseconds(delay_time);&#xD;
        digitalWrite(SPEC_CLK, HIGH);&#xD;
        delayMicroseconds(delay_time);&#xD;
        digitalWrite(SPEC_CLK, LOW);&#xD;
    &#xD;
        // Analog value is valid on low transition&#xD;
        if (accumulateMode == false) {&#xD;
          data[idx] = analogRead(SPEC_VIDEO);&#xD;
        } else {&#xD;
          data[idx] += analogRead(SPEC_VIDEO);&#xD;
        }&#xD;
        idx += 1;&#xD;
        if (delay_time &amp;gt; read_time) delayMicroseconds(delay_time - read_time);   // Read takes about 135uSec&#xD;
    &#xD;
        digitalWrite(SPEC_CLK, HIGH);&#xD;
        delayMicroseconds(delay_time);&#xD;
    &#xD;
        // Second block of 2 clocks -- idle&#xD;
        digitalWrite(SPEC_CLK, LOW);&#xD;
        delayMicroseconds(delay_time);&#xD;
        digitalWrite(SPEC_CLK, HIGH);&#xD;
        delayMicroseconds(delay_time);&#xD;
        digitalWrite(SPEC_CLK, LOW);&#xD;
        delayMicroseconds(delay_time);&#xD;
        digitalWrite(SPEC_CLK, HIGH);&#xD;
        delayMicroseconds(delay_time);&#xD;
      }&#xD;
    &#xD;
      // Step 6: trailing clock pulses&#xD;
      for (int i = 0; i &amp;lt; SPEC_CHANNELS; i++) {&#xD;
        digitalWrite(SPEC_CLK, LOW);&#xD;
        delayMicroseconds(delay_time);&#xD;
        digitalWrite(SPEC_CLK, HIGH);&#xD;
        delayMicroseconds(delay_time);&#xD;
      }&#xD;
    }&#xD;
    &#xD;
    void print_data()&#xD;
    {&#xD;
      for (int i = 0; i &amp;lt; SPEC_CHANNELS; i++) &#xD;
      {&#xD;
        Serial.print(data[i]);&#xD;
        Serial.print(&amp;#039;,&amp;#039;);&#xD;
      }&#xD;
      Serial.print(&amp;#034;\n&amp;#034;);&#xD;
    }&#xD;
    &#xD;
    void loop() &#xD;
    {&#xD;
    //  digitalWrite(LASER_404, HIGH);&#xD;
    //  readSpectrometer();&#xD;
    //  digitalWrite(LASER_404, LOW);&#xD;
    //  print_data();&#xD;
    //  delay(10);&#xD;
      &#xD;
    //  digitalWrite(WHITE_LED, HIGH);&#xD;
    //  readSpectrometer();&#xD;
    //  digitalWrite(WHITE_LED, LOW);&#xD;
    //  print_data();&#xD;
    //  delay(10);&#xD;
      &#xD;
      readSpectrometer();&#xD;
      print_data();&#xD;
      delay(10);   &#xD;
    }&#xD;
&#xD;
&#xD;
The Arduino code is quite long, and I first thought of only attaching it to the post, but one particular line is important. In the &amp;#034;void readSpectrometer()&amp;#034; part there is the line&#xD;
&#xD;
    int intTime = 5; &#xD;
&#xD;
which allows you to set the integration time, i.e. the exposure time. That is a very important parameter for getting optimal results. With the new features of the *Wolfram Language*, and in particular the [Arduino Device Connection][10], it is possible to adjust the exposure from within Wolfram Language code. For this post that is not really necessary, so I will save it for another day. &#xD;
&#xD;
Connecting the Arduino-Spectrometer duo to Mathematica is trivial. Here I connect it to an OSX system.&#xD;
&#xD;
    mySpectrometer = DeviceOpen[&amp;#034;Serial&amp;#034;, {Quiet[FileNames[&amp;#034;tty.usb*&amp;#034;, {&amp;#034;/dev&amp;#034;}, Infinity]][[1]], &amp;#034;BaudRate&amp;#034; -&amp;gt; 115200}]&#xD;
&#xD;
This piece of code&#xD;
&#xD;
    Quiet[FileNames[&amp;#034;tty.usb*&amp;#034;, {&amp;#034;/dev&amp;#034;}, Infinity]][[1]]&#xD;
&#xD;
facilitates the connection as it detects the correct device automatically. We can now start collecting data like so:&#xD;
&#xD;
    data = Table[Pause[2]; ToExpression /@ StringSplit[FromCharacterCode[SplitBy[DeviceReadBuffer[mySpectrometer], # == 10 &amp;amp;][[-2]]], &amp;#034;,&amp;#034;], {i, 6}];&#xD;
&#xD;
This data acquisition code actually measures 6 spectra and pauses for 2 seconds between individual measurements. These repeated measurements decrease the noise of the result. The measurement procedure is very straightforward: you point the spectrometer at an object and execute the data acquisition. Let&amp;#039;s first look at the spectrum of a fluorescent light bulb. &#xD;
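For readers who want the parsing logic spelled out, here is a minimal Python sketch (a hypothetical helper, not part of the Wolfram code) of what the one-liner above does: split the raw serial byte buffer into newline-terminated frames and convert the last complete frame of comma-separated ASCII digits into integers.

```python
def parse_last_frame(buffer):
    # Group the raw byte stream into frames at newline (ASCII code 10);
    # anything after the final newline is a partial frame and is discarded.
    frames, current = [], []
    for b in buffer:
        if b == 10:
            frames.append(current)
            current = []
        else:
            current.append(b)
    # Convert the last complete frame of comma-separated digits to integers.
    values, num, seen = [], 0, False
    for b in frames[-1]:
        if b == 44:                  # comma ends one number
            if seen:
                values.append(num)
            num, seen = 0, False
        elif 57 >= b >= 48:          # ASCII digit 0-9
            num, seen = num * 10 + (b - 48), True
    if seen:
        values.append(num)
    return values

# The bytes below spell 512,13,7, then a newline and the start of the next frame
print(parse_last_frame([53, 49, 50, 44, 49, 51, 44, 55, 44, 10, 52, 50]))
```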
&#xD;
    ListLinePlot[N@Mean[Select[data, Length[#] == 256 &amp;amp;]], PlotRange -&amp;gt; All]&#xD;
&#xD;
![enter image description here][11]&#xD;
&#xD;
There are several peaks that we will try to understand a little bit later. The x-axis shows 256 bins which represent the different colours, i.e. frequencies. The y-axis shows the count, i.e. the intensity in that frequency band. In order to be able to interpret the results we first need to calibrate the spectrometer. It turns out that the calibration is performed using a 5th order polynomial; the respective coefficients are given in [the calibration table][12], which covers various versions of the spectrometer. For my particular spectrometer I obtain:&#xD;
&#xD;
    a0 = 3.170083173*10^2;&#xD;
    b1 = 2.39519817;&#xD;
    b2 = -8.618615345*10^(-4);&#xD;
    b3 = -5.978279712*10^(-6);&#xD;
    b4 = 8.585352787*10^(-9);&#xD;
    b5 = -2.048534811*10^(-12);&#xD;
    wavelength[x_] := a0 + b1  x + b2  x^2 + b3  x^3 + b4*x^4 + b5  x^5&#xD;
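As a cross-check, the same calibration polynomial can be evaluated in a few lines of Python (an illustrative sketch; the coefficients are the ones quoted above for my unit, and yours will differ):

```python
# Calibration coefficients for this particular C12666MA unit (from the table)
COEFFS = [3.170083173e2, 2.39519817, -8.618615345e-4,
          -5.978279712e-6, 8.585352787e-9, -2.048534811e-12]

def wavelength(x):
    # Evaluate the 5th order polynomial with the Horner scheme
    result = 0.0
    for c in reversed(COEFFS):
        result = result * x + c
    return result

print(wavelength(0))    # 317 nm: the short-wavelength end of the sensor
print(wavelength(256))  # about 808 nm at the last bin
```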
&#xD;
Here is a plot of the calibration curve:&#xD;
&#xD;
    Plot[{wavelength[x]}, {x, 0, 256}]&#xD;
&#xD;
![enter image description here][13]&#xD;
&#xD;
It transforms the number of the bin to the corresponding wavelength. We can now plot the spectrum with the correct x-axis.&#xD;
&#xD;
    datacalibrated = Transpose@{wavelength /@ Range[256], N@Mean[Select[data, Length[#] == 256 &amp;amp;]]};&#xD;
    ListLinePlot[datacalibrated, PlotRange -&amp;gt; All]&#xD;
&#xD;
![enter image description here][14]&#xD;
&#xD;
This is much better. Here is how a professional spectrum of a fluorescent light bulb looks:&#xD;
&#xD;
![enter image description here][15]&#xD;
&#xD;
which is taken from [Wikimedia Commons][16]. Here is a list of the peaks and the elements they correspond to:&#xD;
&#xD;
![enter image description here][17]&#xD;
&#xD;
We clearly see the peaks for Mercury, Terbium and Europium. Wolfram|Alpha has a wealth of information about spectral lines and we can use the following line to get it.&#xD;
&#xD;
    WolframAlpha[&amp;#034;spectral lines mercury&amp;#034;]&#xD;
&#xD;
![enter image description here][18]&#xD;
&#xD;
The two dominant lines here are the ones at 4046.565 Angstrom and 4358.335 Angstrom. They correspond to our lines at about 405 nm and 436 nm. Unfortunately, I have failed to extract a list of all relevant spectral lines from Wolfram|Alpha.&#xD;
&#xD;
Let&amp;#039;s try to spice our representation of the spectrum up a bit. It would be nice to have a visual cue as to the colour the different wavelengths correspond to. Mathematica and the Wolfram Language have everything built in to make this task really easy:&#xD;
&#xD;
    ColorData[&amp;#034;VisibleSpectrum&amp;#034;][#] &amp;amp; /@ datacalibrated[[All, 1]]&#xD;
&#xD;
![enter image description here][19]&#xD;
&#xD;
We can now merge this into a band of colours that we can plot with the spectrum.&#xD;
&#xD;
    Graphics[Table[{ColorData[&amp;#034;VisibleSpectrum&amp;#034;][datacalibrated[[i, 1]]], Rectangle[{datacalibrated[[i, 1]], 0}, {datacalibrated[[i + 1, 1]],40}]}, {i, 1, Length[datacalibrated] - 1}]]&#xD;
&#xD;
![enter image description here][20]&#xD;
&#xD;
If we plot this together things start being easier to interpret. &#xD;
&#xD;
    Show[ListLinePlot[datacalibrated, PlotRange -&amp;gt; All], Graphics[Table[{ColorData[&amp;#034;VisibleSpectrum&amp;#034;][datacalibrated[[i, 1]]],&#xD;
         Rectangle[{datacalibrated[[i, 1]], 10}, {datacalibrated[[i + 1, 1]], 40}]}, {i, 1, Length[datacalibrated] - 1}]]]&#xD;
&#xD;
![enter image description here][21]&#xD;
&#xD;
Now there are two more things I want to tweak. First of all, there is an offset of 54; that is a &amp;#034;zero count&amp;#034;, i.e. I obtain a reading of at least 54 even if there is no signal, so I need to subtract that. This number depends on the spectrometer that you have; on a second device I own the value is different. Also, I would like the light curve itself to reflect the colour. The following code achieves that:&#xD;
&#xD;
    Show[ListLinePlot[Evaluate@(Plus[{0., -54.}, #] &amp;amp; /@ datacalibrated), &#xD;
      PlotRange -&amp;gt; {All, {-30, All}}, Joined -&amp;gt; True, Frame -&amp;gt; True, &#xD;
      ColorFunction -&amp;gt; (Blend[&amp;#034;VisibleSpectrum&amp;#034;, #1* Differences[datacalibrated[[All, 1]][[{1, -1}]]][[1]] + &#xD;
      datacalibrated[[1, 1]]] &amp;amp;), Filling -&amp;gt; Axis, LabelStyle -&amp;gt; Directive[Black, Bold, Medium], &#xD;
      FrameLabel -&amp;gt; {&amp;#034;Wavelength (nm)&amp;#034;, &amp;#034;Intensity&amp;#034;}], Graphics[Table[{ColorData[&amp;#034;VisibleSpectrum&amp;#034;][datacalibrated[[i, 1]]],&#xD;
         Rectangle[{datacalibrated[[i, 1]], -30}, {datacalibrated[[i + 1, 1]], -10}]}, {i, 1, Length[datacalibrated] - 1}]]]&#xD;
&#xD;
![enter image description here][22]&#xD;
&#xD;
To identify the spectral lines it is useful to locate the maxima of the curve. The following function helps to achieve that:&#xD;
&#xD;
    localMaxPositions = &#xD;
      Compile[{{pts, _Real, 1}}, &#xD;
       Module[{result = Table[0, {Length[pts]}], i = 1, ctr = 0}, For[i = 2, i &amp;lt; Length[pts], i++, &#xD;
         If[pts[[i - 1]] &amp;lt; pts[[i]] &amp;amp;&amp;amp; pts[[i + 1]] &amp;lt; pts[[i]],result[[++ctr]] = i]];Take[result, ctr]]];&#xD;
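The compiled function finds strict interior maxima. A plain Python transcription (illustrative, using 1-based indices to match the Wolfram version) looks like this:

```python
def local_max_positions(pts):
    # Return the 1-based positions i where pts[i] is strictly larger than
    # both neighbours, matching the compiled Wolfram function above
    result = []
    for i in range(1, len(pts) - 1):
        if pts[i] > pts[i - 1] and pts[i] > pts[i + 1]:
            result.append(i + 1)
    return result

print(local_max_positions([0, 2, 1, 3, 3, 1, 5, 0]))  # positions 2 and 7
```

Note that plateaus (equal neighbouring values) are deliberately not counted, just as in the Wolfram version.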
&#xD;
We can now locate the maxima and plot that on the curve:&#xD;
&#xD;
    dplot = ListLinePlot[datacalibrated, PlotRange -&amp;gt; All];&#xD;
    maxs = ListPlot[Select[Nest[#[[localMaxPositions[#[[All, 2]]]]] &amp;amp;, datacalibrated, 1], #[[2]] &amp;gt; 58.33 &amp;amp;], PlotStyle -&amp;gt; Directive[PointSize[0.015], Green]];&#xD;
    Show[{dplot, maxs}]&#xD;
&#xD;
![enter image description here][23]&#xD;
&#xD;
The following function produces a list of the positions of the maxima:&#xD;
&#xD;
    Select[Nest[#[[localMaxPositions[#[[All, 2]]]]] &amp;amp;, datacalibrated, 1], #[[2]] &amp;gt; 58.33 &amp;amp;][[All, 1]]&#xD;
    (*{366.873694406445`,406.4710188519941`,436.1860721777116`,489.\&#xD;
    5449068238321`,544.8439203380104`,587.3854766583466`,610.\&#xD;
    7844904587837`,668.0095922519685`,688.8768717778272`,707.\&#xD;
    3639428495288`}*)&#xD;
&#xD;
It turns out that there is a package which contains information about spectral lines, but the elements that we are interested in (Hg, Tb, Eu) are not in the database. For elements that are in the database we can compare the measured lines to the ones in the database:&#xD;
&#xD;
    &amp;lt;&amp;lt; ResonanceAbsorptionLines`&#xD;
    ElementAbsorptionMap[Na]&#xD;
&#xD;
![enter image description here][24]&#xD;
&#xD;
What we can do, however, is to generate - or simulate - an approximation of the spectral lines we measure. There is a [discussion on Stack Exchange][25] that shows how to plot an emission spectrum. The following lines are taken from that discussion. &#xD;
&#xD;
    spec[wavelength_, width_] := Flatten[Table[{{x, 0, x}, {x, 1, x}}, {x, wavelength - width, wavelength + width, 0.1}], 1];&#xD;
    ListDensityPlot[&#xD;
     spec[#, 1] &amp;amp; /@ &#xD;
      Select[Nest[#[[localMaxPositions[#[[All, 2]]]]] &amp;amp;, datacalibrated, &#xD;
         1], #[[2]] &amp;gt; 59.33 &amp;amp;][[All, 1]], &#xD;
     ColorFunction -&amp;gt; ColorData[&amp;#034;VisibleSpectrum&amp;#034;], &#xD;
     ColorFunctionScaling -&amp;gt; False, AspectRatio -&amp;gt; .3, &#xD;
     PlotRange -&amp;gt; {{300, 800}}, &#xD;
     FrameTicks -&amp;gt; {Automatic, None, None, None}, &#xD;
     FrameTicksStyle -&amp;gt; White, Frame -&amp;gt; True, Background -&amp;gt; Black]&#xD;
&#xD;
![enter image description here][26]&#xD;
&#xD;
Case Studies&#xD;
============&#xD;
&#xD;
Blue sky&#xD;
--------&#xD;
&#xD;
I now measure the spectrum of the blue sky. The measurement was taken late in the year and the sky wasn&amp;#039;t the &amp;#034;bluest of blues&amp;#034;, but we can still make out some interesting features.&#xD;
&#xD;
    mySpectrometer = &#xD;
      DeviceOpen[&#xD;
       &amp;#034;Serial&amp;#034;, {Quiet[FileNames[&amp;#034;tty.usb*&amp;#034;, {&amp;#034;/dev&amp;#034;}, Infinity]][[1]], &#xD;
        &amp;#034;BaudRate&amp;#034; -&amp;gt; 115200}];&#xD;
    datasky = &#xD;
      Table[Pause[2]; &#xD;
       ToExpression /@ &#xD;
        StringSplit[&#xD;
         FromCharacterCode[&#xD;
          SplitBy[DeviceReadBuffer[mySpectrometer], # == 10 &amp;amp;][[-2]]], &#xD;
         &amp;#034;,&amp;#034;], {i, 6}];&#xD;
    a0 = 3.170083173*10^2;&#xD;
    b1 = 2.39519817;&#xD;
    b2 = -8.618615345*10^(-4);&#xD;
    b3 = -5.978279712*10^(-6);&#xD;
    b4 = 8.585352787*10^(-9);&#xD;
    b5 = -2.048534811*10^(-12);&#xD;
    wavelength[x_] := a0 + b1  x + b2  x^2 + b3  x^3 + b4*x^4 + b5  x^5;&#xD;
    datacalibratedsky = &#xD;
      Transpose@{wavelength /@ Range[256], N@Mean[Select[datasky, Length[#] == 256 &amp;amp;]]};&#xD;
    Show[ListLinePlot[Evaluate@(Plus[{0., -54.}, #] &amp;amp; /@ datacalibratedsky), &#xD;
      PlotRange -&amp;gt; {All, {-30, All}}, Joined -&amp;gt; True, Frame -&amp;gt; True, ColorFunction -&amp;gt; (Blend[&amp;#034;VisibleSpectrum&amp;#034;, #1*Differences[datacalibratedsky[[All, 1]][[{1, -1}]]][[1]] + datacalibratedsky[[1, 1]]] &amp;amp;), Filling -&amp;gt; Axis, LabelStyle -&amp;gt; Directive[Black, Bold, Medium], FrameLabel -&amp;gt; {&amp;#034;Wavelength (nm)&amp;#034;, &amp;#034;Intensity&amp;#034;}], &#xD;
     Graphics[Table[{ColorData[&amp;#034;VisibleSpectrum&amp;#034;][datacalibratedsky[[i, 1]]], Rectangle[{datacalibratedsky[[i, 1]], -30}, {datacalibratedsky[[i + 1, 1]], -10}]}, {i, 1, Length[datacalibratedsky] - 1}]]]&#xD;
&#xD;
![enter image description here][27]&#xD;
&#xD;
The spectrum clearly shows the &amp;#034;blue&amp;#034; in the sky. It turns out that the marked double dip corresponds to absorption by water. &#xD;
&#xD;
Lasers&#xD;
------&#xD;
&#xD;
Using exactly the same code we can now analyse the spectra of different lasers - red, green and blue. We can clearly see that the red and green lasers have a very narrow peak within the red and green spectral bands, whereas the blue laser has a much broader peak. This is probably related to how blue laser light is generated in cheap laser pointers. &#xD;
&#xD;
![enter image description here][28]&#xD;
&#xD;
Here is how the spectrum of a red laser would look:&#xD;
&#xD;
    ListDensityPlot[&#xD;
     spec[#, 1] &amp;amp; /@ &#xD;
      Select[Nest[#[[localMaxPositions[#[[All, 2]]]]] &amp;amp;, datacalibratedlaserr, 1], #[[2]] &amp;gt; 59.33 &amp;amp;][[All, 1]], &#xD;
     ColorFunction -&amp;gt; ColorData[&amp;#034;VisibleSpectrum&amp;#034;], ColorFunctionScaling -&amp;gt; False, AspectRatio -&amp;gt; .3, &#xD;
     PlotRange -&amp;gt; {{400, 800}}, FrameTicks -&amp;gt; {Automatic, None, None, None}, &#xD;
     FrameTicksStyle -&amp;gt; White, Frame -&amp;gt; True, Background -&amp;gt; Black] &#xD;
&#xD;
![enter image description here][29]&#xD;
&#xD;
Absorption spectrum - green leaf&#xD;
--------------------------------&#xD;
&#xD;
Up to now we have mainly discussed emission spectra - apart from some features of the blue sky example. Let&amp;#039;s now generate an absorption spectrum. In an emission spectrum, a material actively emits radiation/light. In an absorption spectrum, we measure the light that passes through the material. &#xD;
&#xD;
![enter image description here][30]&#xD;
&#xD;
In order to produce a good absorption spectrum we would ideally use a light source that produces a strong continuous spectrum. When I conducted the leaf experiment I only had a very cheap lamp which is usually used by doctors for initial examinations:&#xD;
&#xD;
![enter image description here][31]&#xD;
&#xD;
I pointed the lamp at a green leaf and measured the light that shone through. The problem was that the lamp produced a very poor spectrum, heavily biased towards the red frequency range. So in this case study I will first measure the emission spectrum of the lamp and then use it to normalise the absorption spectrum of the leaf. Let&amp;#039;s start with the lamp.&#xD;
&#xD;
    mySpectrometer = &#xD;
      DeviceOpen[&#xD;
       &amp;#034;Serial&amp;#034;, {Quiet[FileNames[&amp;#034;tty.usb*&amp;#034;, {&amp;#034;/dev&amp;#034;}, Infinity]][[1]], &#xD;
        &amp;#034;BaudRate&amp;#034; -&amp;gt; 115200}];&#xD;
    datalamp = &#xD;
      Table[Pause[2]; &#xD;
       ToExpression /@ &#xD;
        StringSplit[&#xD;
         FromCharacterCode[&#xD;
          SplitBy[DeviceReadBuffer[mySpectrometer], # == 10 &amp;amp;][[-2]]], &#xD;
         &amp;#034;,&amp;#034;], {i, 6}];&#xD;
    a0 = 3.170083173*10^2;&#xD;
    b1 = 2.39519817;&#xD;
    b2 = -8.618615345*10^(-4);&#xD;
    b3 = -5.978279712*10^(-6);&#xD;
    b4 = 8.585352787*10^(-9);&#xD;
    b5 = -2.048534811*10^(-12);&#xD;
    wavelength[x_] := a0 + b1  x + b2  x^2 + b3  x^3 + b4*x^4 + b5  x^5;&#xD;
    datacalibratedlamp = &#xD;
      Transpose@{wavelength /@ Range[256], N@Mean[Select[datalamp, Length[#] == 256 &amp;amp;]]};&#xD;
    Show[ListLinePlot[Evaluate@(Plus[{0., -54.}, #] &amp;amp; /@ datacalibratedlamp), &#xD;
      PlotRange -&amp;gt; {All, {-30, All}}, Joined -&amp;gt; True, Frame -&amp;gt; True, ColorFunction -&amp;gt; (Blend[&amp;#034;VisibleSpectrum&amp;#034;, #1*Differences[datacalibratedlamp[[All, 1]][[{1, -1}]]][[1]] + datacalibratedlamp[[1, 1]]] &amp;amp;), Filling -&amp;gt; Axis, &#xD;
      LabelStyle -&amp;gt; Directive[Black, Bold, Medium], &#xD;
      FrameLabel -&amp;gt; {&amp;#034;Wavelength (nm)&amp;#034;, &amp;#034;Intensity&amp;#034;}], &#xD;
     Graphics[Table[{ColorData[&amp;#034;VisibleSpectrum&amp;#034;][datacalibratedlamp[[i, 1]]], &#xD;
     Rectangle[{datacalibratedlamp[[i,1]], -30}, {datacalibratedlamp[[i + 1, 1]], -10}]}, {i, 1,Length[datacalibratedlamp] - 1}]]]&#xD;
&#xD;
![enter image description here][32]&#xD;
&#xD;
The spectrum is clearly biased towards the red frequencies. In the blue/ultraviolet range the light source produces so little output that it will be impossible to determine a reasonable absorption spectrum there. Let&amp;#039;s go on to measure the absorption of the leaf.&#xD;
&#xD;
    absorptionleaf = &#xD;
      Transpose[{datacalibratedleaf[[All, 1]], (datacalibratedleaf[[All, 2]]/datacalibratedlamp[[All, 2]])}];&#xD;
    Show[ListLinePlot[Evaluate@(Plus[{0., 0.}, #] &amp;amp; /@ absorptionleaf), PlotRange -&amp;gt; {All, {-0.3, 1}}, Joined -&amp;gt; True, Frame -&amp;gt; True, &#xD;
      ColorFunction -&amp;gt; (Blend[&amp;#034;VisibleSpectrum&amp;#034;, #1*Differences[absorptionleaf[[All, 1]][[{1, -1}]]][[1]] + absorptionleaf[[1, 1]]] &amp;amp;), Filling -&amp;gt; Axis, &#xD;
      LabelStyle -&amp;gt; Directive[Black, Bold, Medium], FrameLabel -&amp;gt; {&amp;#034;Wavelength (nm)&amp;#034;, &amp;#034;Intensity&amp;#034;}], Graphics[Table[{ColorData[&amp;#034;VisibleSpectrum&amp;#034;][absorptionleaf[[i, 1]]],Rectangle[{absorptionleaf[[i, 1]], -0.3}, {absorptionleaf[[i + 1,1]], -.1}]}, {i, 1, Length[absorptionleaf] - 1}]]]&#xD;
&#xD;
![enter image description here][33]&#xD;
&#xD;
The absorption in the red frequency range stems from chlorophyll; chlorophyll also absorbs in the 400-450nm range, which is hard to see here because of our poor light source. The dip at around 500nm is due to carotenoids. You can compare this to the [spectrum of leaves on this website][34]. &#xD;
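The idea behind the normalisation can be sketched in a few lines of Python (a hypothetical helper; this variant also subtracts the zero count of 54 discussed earlier, which the Wolfram code above leaves out):

```python
def transmission(sample_counts, lamp_counts, zero_count=54.0):
    # Divide the spectrum measured through the sample by the lamp spectrum,
    # bin by bin, after removing the dark offset; the result is the fraction
    # of light transmitted in each wavelength bin
    return [(s - zero_count) / (l - zero_count)
            for s, l in zip(sample_counts, lamp_counts)]

# Made-up counts for two bins: both transmit half of the lamp light
print(transmission([104.0, 554.0], [154.0, 1054.0]))  # [0.5, 0.5]
```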
&#xD;
&amp;#034;Black body radiation&amp;#034; - an old fashioned light bulb&#xD;
----------------------------------------------------&#xD;
&#xD;
The final example will be of an old fashioned, small light bulb. &#xD;
&#xD;
![enter image description here][35]&#xD;
&#xD;
I will use a bench supply to slowly increase the voltage and the current. The filament will go from a red glowing colour to a brighter, &amp;#034;whiter&amp;#034; colour, but we will see that even at the highest voltage of 12V the spectrum will still be quite different from &amp;#034;white&amp;#034;, i.e. uniform. The code is just the same as above. I save all plots for voltages 1V to 12V in increments of 1V in one variable:&#xD;
&#xD;
    specall = {spec1V, spec2V, spec3V, spec4V, spec5V, spec6V, spec7V, spec8V, spec9V, spec10V, spec11V, spec12V}&#xD;
&#xD;
This can be easily plotted like so:&#xD;
&#xD;
    Grid[Partition[specall, 4]]&#xD;
&#xD;
![enter image description here][36]&#xD;
&#xD;
We can also animate this:&#xD;
&#xD;
    ListAnimate[specall, AnimationRunTime -&amp;gt; 10]&#xD;
&#xD;
![enter image description here][37]&#xD;
&#xD;
Note that from 9V onwards the spectrum has maxed out. When I measured the spectra I took note of the voltage and the corresponding current:&#xD;
&#xD;
    voltageamp = {{1, 0.04}, {2, 0.06}, {3, 0.07}, {4, 0.08}, {5, 0.1}, {6, 0.11}, {7, 0.12}, {8, 0.13}, {9, 0.14}, {10, 0.14}, {11, 0.15}, {12, 0.16}}&#xD;
&#xD;
I can now plot these data together with a regression line. &#xD;
&#xD;
    Show[ListPlot[voltageamp, AxesLabel -&amp;gt; {&amp;#034;Volts&amp;#034;, &amp;#034;Amps&amp;#034;}, LabelStyle -&amp;gt; Directive[Bold, Medium]], &#xD;
     Plot[Evaluate@Fit[voltageamp, {1, x}, x], {x, 0, 12}, PlotStyle -&amp;gt; Red]]&#xD;
&#xD;
![enter image description here][38]&#xD;
&#xD;
Note the nearly linear increase in current as the voltage increases. Finally, I took photos as the voltage increased. &#xD;
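For reference, Fit[voltageamp, {1, x}, x] computes an ordinary least-squares line; written out via the normal equations (here in Python, as an illustrative sketch) it gives a slope of roughly 0.0106 amps per volt:

```python
def linear_fit(points):
    # Ordinary least squares for a line a + b*x, via the normal equations
    n = len(points)
    sx = sum(p[0] for p in points)
    sy = sum(p[1] for p in points)
    sxx = sum(p[0] ** 2 for p in points)
    sxy = sum(p[0] * p[1] for p in points)
    slope = (n * sxy - sx * sy) / (n * sxx - sx ** 2)
    intercept = (sy - slope * sx) / n
    return intercept, slope

voltageamp = [(1, 0.04), (2, 0.06), (3, 0.07), (4, 0.08), (5, 0.1), (6, 0.11),
              (7, 0.12), (8, 0.13), (9, 0.14), (10, 0.14), (11, 0.15), (12, 0.16)]
intercept, slope = linear_fit(voltageamp)
print(intercept, slope)  # roughly 0.040 and 0.0106
```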
&#xD;
![enter image description here][39]&#xD;
&#xD;
We now add a little bit of motion to the voltage amp graph:&#xD;
&#xD;
    figvoltsamp = &#xD;
      Evaluate@Table[Show[ListPlot[voltageamp, AxesLabel -&amp;gt; {&amp;#034;Volts&amp;#034;, &amp;#034;Amps&amp;#034;}, LabelStyle -&amp;gt; Directive[Bold, Medium], &#xD;
          Epilog -&amp;gt; {PointSize[Large], Green, Point[voltageamp[[i]]]}], Plot[Evaluate@Fit[voltageamp, {1, x}, x], {x, 0, 12}, PlotStyle -&amp;gt; Red]], {i, 1, 12}];&#xD;
&#xD;
To finish everything off, we can now animate this:&#xD;
&#xD;
    ListAnimate[GraphicsRow[#, ImageSize -&amp;gt; Full] &amp;amp; /@ Transpose[{specall, bulb, figvoltsamp}]]&#xD;
&#xD;
![enter image description here][40]&#xD;
&#xD;
If we were to assume that this is the radiation of a black body, we could use Planck&amp;#039;s law:&#xD;
&#xD;
    FormulaData[{&amp;#034;PlanckRadiationLaw&amp;#034;, &amp;#034;Wavelength&amp;#034;}]&#xD;
&#xD;
![enter image description here][41]&#xD;
&#xD;
We could plot this as a function of temperature and wavelength:&#xD;
&#xD;
    equation = &#xD;
      FormulaData[{&amp;#034;PlanckRadiationLaw&amp;#034;, &amp;#034;Wavelength&amp;#034;}, {&amp;#034;T&amp;#034; -&amp;gt; Quantity[t, &amp;#034;Kelvins&amp;#034;], &#xD;
        &amp;#034;\[Lambda]&amp;#034; -&amp;gt; Quantity[l, &amp;#034;Nanometers&amp;#034;]}];&#xD;
    Plot3D[Quantity[1.191042*^29, (&amp;#034;Pascals&amp;#034;)/(&amp;#034;Seconds&amp;#034;)]/(-1.`16.255 l^5 + 2.71828^(1.438*^7/(l t)) l^5), {l, 300, 808}, {t, 1000, 5000}, PlotRange -&amp;gt; All, &#xD;
     AxesLabel -&amp;gt; {&amp;#034;wavelength&amp;#034;, &amp;#034;temperature&amp;#034;, &amp;#034;luminosity&amp;#034;}, LabelStyle -&amp;gt; Directive[Bold, Medium], ImageSize -&amp;gt; Large]&#xD;
&#xD;
![enter image description here][42]&#xD;
&#xD;
We can also write a little loop to calculate the wavelength at which the maximum is reached for different temperatures.&#xD;
&#xD;
    results = {}; Monitor[&#xD;
     Table[equation2 = FormulaData[{&amp;#034;PlanckRadiationLaw&amp;#034;, &#xD;
     &amp;#034;Wavelength&amp;#034;}, {&amp;#034;T&amp;#034; -&amp;gt; Quantity[t, &amp;#034;Kelvins&amp;#034;], &amp;#034;\[Lambda]&amp;#034; -&amp;gt; Quantity[l, &amp;#034;Nanometers&amp;#034;]}]; &#xD;
      root = FindRoot[D[equation2[[2, 2]], l] == 0, {l, 700}, MaxIterations -&amp;gt; Infinity]; &#xD;
      AppendTo[results, {t, root}], {t, 2800, 3400, 100}], t]&#xD;
&#xD;
This gives the following table:&#xD;
&#xD;
    Grid[Join[{{&amp;#034;Temperature K&amp;#034;, &amp;#034;\[Lambda] at max&amp;#034;}}, Transpose[{results[[All, 1]], results[[All, 2, -1, 2]]}]], Frame -&amp;gt; All]&#xD;
&#xD;
![enter image description here][43]&#xD;
&#xD;
We can plot the relationship between the wavelength at the maximum and the temperature.&#xD;
&#xD;
    ListLinePlot[Transpose[{results[[All, 1]], results[[All, 2, -1, 2]]}],Mesh -&amp;gt; Full, MeshStyle -&amp;gt; Red, &#xD;
     AxesLabel -&amp;gt; {&amp;#034;Temperature&amp;#034;, &amp;#034;wavelength at max&amp;#034;}, LabelStyle -&amp;gt; Directive[Bold, Medium], ImageSize -&amp;gt; Large]&#xD;
&#xD;
![enter image description here][44]&#xD;
&#xD;
This allows us, in principle, to estimate the temperature of the filament. Note that for 8 V the maximum is at about 750 nm. The graph shows that this corresponds to a temperature higher than 3400 K, which is too high for such a small light bulb. So there is still quite some room for improvement... The principle, however, of measuring the temperature of a sample by its colour is sound.&#xD;
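&#xD;
As a plausibility check (my own addition, not part of the original analysis), the fitted maxima should follow Wien&amp;#039;s displacement law, lambda_max = b/T with b approximately 2.898*10^6 nm K:&#xD;
&#xD;
    (* Wien&amp;#039;s displacement law as a cross-check *)&#xD;
    wienLambda[t_] := 2.898*10^6/t  (* wavelength of the maximum, in nm *)&#xD;
    wienLambda[3400.]  (* about 852 nm *)&#xD;
    2.898*10^6/750.    (* a maximum at 750 nm implies about 3860 K *)&#xD;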
&#xD;
There are many further projects one could think of. I suppose that with a decent telescope it should be possible to analyse the light of stars, for example. Also, the spectrometer works nicely on a Raspberry Pi. It should be quite straightforward to take measurements of, say, the sky over the course of a day and see how the dominant colours change.&#xD;
&#xD;
Cheers,&#xD;
&#xD;
Marco&#xD;
&#xD;
&#xD;
  [1]: http://www.light2015.org/Home.html&#xD;
  [2]: http://www.light2015.org/Home/About/Latest-News/February2016/The-International-Year-of-Light-and-Light-based-Technologies-2015-Closing-Ceremony.html&#xD;
  [3]: http://www.wolframalpha.com/input/?i=Merida%20Yukatan&#xD;
  [4]: http://community.wolfram.com//c/portal/getImageAttachment?filename=ScreenShot2016-02-09at01.09.22.png&amp;amp;userId=48754&#xD;
  [5]: http://community.wolfram.com/groups/-/m/t/178634&#xD;
  [6]: http://community.wolfram.com/groups/-/m/t/201504&#xD;
  [7]: https://www.tindie.com/products/PureEngineering/arduino-c12666ma-micro-spectrometer-/&#xD;
  [8]: https://www.arduino.cc/en/Main/ArduinoBoardUno&#xD;
  [9]: http://community.wolfram.com//c/portal/getImageAttachment?filename=ScreenShot2016-02-09at01.12.11.png&amp;amp;userId=48754&#xD;
  [10]: https://reference.wolfram.com/language/ref/device/Arduino.html&#xD;
  [11]: http://community.wolfram.com//c/portal/getImageAttachment?filename=ScreenShot2016-02-09at01.14.33.png&amp;amp;userId=48754&#xD;
  [12]: https://files.groupgets.com/hamamatsu/uspectrometer/Hama-Data-Camp-7.pdf&#xD;
  [13]: http://community.wolfram.com//c/portal/getImageAttachment?filename=ScreenShot2016-02-09at01.17.28.png&amp;amp;userId=48754&#xD;
  [14]: http://community.wolfram.com//c/portal/getImageAttachment?filename=ScreenShot2016-02-09at01.18.21.png&amp;amp;userId=48754&#xD;
  [15]: http://community.wolfram.com//c/portal/getImageAttachment?filename=ScreenShot2016-02-09at01.19.12.png&amp;amp;userId=48754&#xD;
  [16]: https://commons.wikimedia.org/wiki/File:Fluorescent_lighting_spectrum_peaks_labelled.svg&#xD;
  [17]: http://community.wolfram.com//c/portal/getImageAttachment?filename=ScreenShot2016-02-09at01.20.45.png&amp;amp;userId=48754&#xD;
  [18]: http://community.wolfram.com//c/portal/getImageAttachment?filename=ScreenShot2016-02-09at01.21.48.png&amp;amp;userId=48754&#xD;
  [19]: http://community.wolfram.com//c/portal/getImageAttachment?filename=ScreenShot2016-02-09at01.22.53.png&amp;amp;userId=48754&#xD;
  [20]: http://community.wolfram.com//c/portal/getImageAttachment?filename=ScreenShot2016-02-09at01.23.52.png&amp;amp;userId=48754&#xD;
  [21]: http://community.wolfram.com//c/portal/getImageAttachment?filename=ScreenShot2016-02-09at01.24.51.png&amp;amp;userId=48754&#xD;
  [22]: http://community.wolfram.com//c/portal/getImageAttachment?filename=ScreenShot2016-02-09at01.26.07.png&amp;amp;userId=48754&#xD;
  [23]: http://community.wolfram.com//c/portal/getImageAttachment?filename=ScreenShot2016-02-09at01.27.39.png&amp;amp;userId=48754&#xD;
  [24]: http://community.wolfram.com//c/portal/getImageAttachment?filename=ScreenShot2016-02-09at01.28.48.png&amp;amp;userId=48754&#xD;
  [25]: http://mathematica.stackexchange.com/questions/85990/how-to-plot-an-emission-spectrum&#xD;
  [26]: http://community.wolfram.com//c/portal/getImageAttachment?filename=ScreenShot2016-02-09at01.30.19.png&amp;amp;userId=48754&#xD;
  [27]: http://community.wolfram.com//c/portal/getImageAttachment?filename=ScreenShot2016-02-09at01.33.08.png&amp;amp;userId=48754&#xD;
  [28]: http://community.wolfram.com//c/portal/getImageAttachment?filename=ScreenShot2016-02-09at01.34.36.png&amp;amp;userId=48754&#xD;
  [29]: http://community.wolfram.com//c/portal/getImageAttachment?filename=ScreenShot2016-02-09at01.36.25.png&amp;amp;userId=48754&#xD;
  [30]: http://community.wolfram.com//c/portal/getImageAttachment?filename=ScreenShot2016-02-09at01.37.13.png&amp;amp;userId=48754&#xD;
  [31]: http://community.wolfram.com//c/portal/getImageAttachment?filename=ScreenShot2016-02-09at01.37.43.png&amp;amp;userId=48754&#xD;
  [32]: http://community.wolfram.com//c/portal/getImageAttachment?filename=ScreenShot2016-02-09at01.39.26.png&amp;amp;userId=48754&#xD;
  [33]: http://community.wolfram.com//c/portal/getImageAttachment?filename=ScreenShot2016-02-09at01.41.03.png&amp;amp;userId=48754&#xD;
  [34]: http://socratic.org/questions/how-does-the-visible-light-spectrum-relate-to-photosynthesis&#xD;
  [35]: http://community.wolfram.com//c/portal/getImageAttachment?filename=ScreenShot2016-02-09at01.42.16.png&amp;amp;userId=48754&#xD;
  [36]: http://community.wolfram.com//c/portal/getImageAttachment?filename=ScreenShot2016-02-09at01.43.25.png&amp;amp;userId=48754&#xD;
  [37]: http://community.wolfram.com//c/portal/getImageAttachment?filename=spectrumanim.gif&amp;amp;userId=48754&#xD;
  [38]: http://community.wolfram.com//c/portal/getImageAttachment?filename=ScreenShot2016-02-09at01.46.07.png&amp;amp;userId=48754&#xD;
  [39]: http://community.wolfram.com//c/portal/getImageAttachment?filename=ScreenShot2016-02-09at01.46.53.png&amp;amp;userId=48754&#xD;
  [40]: http://community.wolfram.com//c/portal/getImageAttachment?filename=FinalAnimSpectrometer.gif&amp;amp;userId=48754&#xD;
  [41]: http://community.wolfram.com//c/portal/getImageAttachment?filename=ScreenShot2016-02-09at01.49.46.png&amp;amp;userId=48754&#xD;
  [42]: http://community.wolfram.com//c/portal/getImageAttachment?filename=ScreenShot2016-02-09at01.51.00.png&amp;amp;userId=48754&#xD;
  [43]: http://community.wolfram.com//c/portal/getImageAttachment?filename=ScreenShot2016-02-09at01.52.31.png&amp;amp;userId=48754&#xD;
  [44]: http://community.wolfram.com//c/portal/getImageAttachment?filename=ScreenShot2016-02-09at01.53.27.png&amp;amp;userId=48754</description>
    <dc:creator>Marco Thiel</dc:creator>
    <dc:date>2016-02-09T02:05:38Z</dc:date>
  </item>
  <item rdf:about="https://community.wolfram.com/groups/-/m/t/315748">
    <title>Programming the world with Arduino and Wolfram Language</title>
    <link>https://community.wolfram.com/groups/-/m/t/315748</link>
    <description>&amp;amp;[Wolfram Notebook][1]&#xD;
&#xD;
&#xD;
  [1]: https://www.wolframcloud.com/obj/4d655a10-832e-4ace-b937-6d5a702289a3</description>
    <dc:creator>Ian Johnson</dc:creator>
    <dc:date>2014-08-10T17:56:06Z</dc:date>
  </item>
  <item rdf:about="https://community.wolfram.com/groups/-/m/t/456947">
    <title>How to Make a Time Lapse Video With Your Raspberry Pi and Data Drop</title>
    <link>https://community.wolfram.com/groups/-/m/t/456947</link>
    <description>![Wolfram Pi Flowers][9]&#xD;
&#xD;
&#xD;
----------&#xD;
&#xD;
I will explain how to make the time-lapse animation you can see above.&#xD;
&#xD;
**1)** Set up your [camera module][1]. I stuck mine on a hard drive; see the first image in [my previous Data Drop post][2].&#xD;
&#xD;
![setup][3]&#xD;
&#xD;
**2)** Take a test shot to check that the exposure is acceptable.&#xD;
&#xD;
    DeviceRead[&amp;#034;RaspiCam&amp;#034;,{320, 240}]&#xD;
&#xD;
**3)** Adjust the resulting image with [ImageAdjust][4].&#xD;
&#xD;
    ImageAdjust[DeviceRead[&amp;#034;RaspiCam&amp;#034;,{320, 240}]]&#xD;
![Test shot][5]&#xD;
&#xD;
**4)** Create a new databin, and take note of its short ID:&#xD;
&#xD;
    CloudConnect[&amp;#034;email-wolframID&amp;#034;,&amp;#034;password&amp;#034; ];&#xD;
    bin=CreateDatabin[];&#xD;
    bin[&amp;#034;ShortID&amp;#034;]&#xD;
&amp;#034;3GgU-jf4&amp;#034;&#xD;
&#xD;
**5)** Set up a [ScheduledTask][6] that adds a snapshot to your databin every 360 seconds (6 minutes):&#xD;
&#xD;
    intervalometer=RunScheduledTask[DatabinAdd[Databin[&amp;#034;3GgU-jf4&amp;#034;], ImageAdjust[DeviceRead[&amp;#034;RaspiCam&amp;#034;,{320, 240}]]],360]&#xD;
&#xD;
**6)** Water your plant and wait.&#xD;
&#xD;
**7)** Check that your databin is being filled correctly [http://wolfr.am/3GgU-jf4][7]&#xD;
&#xD;
![databin][8]&#xD;
&#xD;
**8)** Compile the animated gif:&#xD;
&#xD;
    frames = Values[Databin[&amp;#034;3GgU-jf4&amp;#034;]]; &#xD;
    Export[&amp;#034;resurrected_plant.gif&amp;#034;, Join[frames, Reverse[frames]]]&#xD;
&#xD;
**9)** Enjoy!&#xD;
&#xD;
![Wolfram Pi Flowers][9]&#xD;
&#xD;
**10)** To stop your scheduled task, use the function [StopScheduledTask][10]:&#xD;
&#xD;
    StopScheduledTask[intervalometer]&#xD;
&#xD;
&#xD;
  [1]: http://community.wolfram.com/groups/-/m/t/157704&#xD;
  [2]: http://community.wolfram.com/groups/-/m/t/453169&#xD;
  [3]: /c/portal/getImageAttachment?filename=setupPlant.png&amp;amp;userId=56204&#xD;
  [4]: http://reference.wolfram.com/language/ref/ImageAdjust.html&#xD;
  [5]: /c/portal/getImageAttachment?filename=FlowerFrames.jpg&amp;amp;userId=56204&#xD;
  [6]: http://reference.wolfram.com/language/ref/RunScheduledTask.html&#xD;
  [7]: http://wolfr.am/3GgU-jf4&#xD;
  [8]: /c/portal/getImageAttachment?filename=databin_filled.png&amp;amp;userId=56204&#xD;
  [9]: /c/portal/getImageAttachment?filename=Wplant.gif&amp;amp;userId=56204&#xD;
  [10]: http://reference.wolfram.com/language/ref/StopScheduledTask.html</description>
    <dc:creator>Bernat Espigulé</dc:creator>
    <dc:date>2015-03-11T11:50:05Z</dc:date>
  </item>
  <item rdf:about="https://community.wolfram.com/groups/-/m/t/196759">
    <title>Reading Temperature Sensors in the Wolfram Language on the RPi</title>
    <link>https://community.wolfram.com/groups/-/m/t/196759</link>
    <description>These sensors are pretty cool--they are [url=http://www.adafruit.com/products/374]cheap to buy[/url] and surprisingly sensitive to small changes in temperature. Here&amp;#039;s a first attempt I made to interact with the sensors in the Wolfram Language.&#xD;
&#xD;
For this setup I used DS18B20 temperature sensors and hooked them up to the Raspberry Pi breadboard according to Adafruit&amp;#039;s [url=http://learn.adafruit.com/adafruits-raspberry-pi-lesson-11-ds18b20-temperature-sensing/overview]setup guide[/url]. The board should look like the following diagram (make sure the sensor is hooked up to a 3.3V pin--not a 5V pin or you could fry the sensor):&#xD;
&#xD;
[center][img=width: 300px; height: 481px;]https://learn.adafruit.com/system/assets/assets/000/003/775/medium800/learn_raspberry_pi_summary.jpg[/img][/center]&#xD;
Once hooked up and connected to your Pi, run the following commands in the terminal:&#xD;
&#xD;
[code]sudo modprobe w1-gpio&#xD;
sudo modprobe w1-therm[/code]&#xD;
The temperatures are read from the sensor by &amp;#034;reading&amp;#034; the file that&amp;#039;s created in the devices directory. You can locate the file with the following commands:&#xD;
&#xD;
[code]cd /sys/bus/w1/devices&#xD;
ls[/code]&#xD;
This will show you the contents of your devices folder, where there should be a file titled 28-xxxx, where the xxxx is the serial number unique to your sensor. Once you&amp;#039;ve got that number, enter:&#xD;
&#xD;
[code]cd 28-xxxx (the xxxx should be replaced with the serial number unique to your sensor)&#xD;
cat w1_slave[/code]&#xD;
Two lines of data should be returned--if the first line ends with &amp;#034;YES&amp;#034;, then the 5-digit number at the end of the second line is the temperature, to be read as xx.xxx degrees Celsius.&#xD;
&#xD;
And now that we know that the temperature sensor is working, and we know how to find it, we can copy the file path and import it using the Wolfram Language.&#xD;
&#xD;
[mcode]Import[&amp;#034;/sys/bus/w1/devices/28-000004fe0343/w1_slave&amp;#034;][/mcode]&#xD;
Since this still returns a really long string of data that we don&amp;#039;t need, we can single out the temperature and then convert the string into a computable expression.&#xD;
&#xD;
[mcode]temp:=N[ToExpression[StringTake[Import[&amp;#034;/sys/bus/w1/devices/28-000004fe0343/w1_slave&amp;#034;],-5]]/1000][/mcode]&#xD;
So now when we read the file, we just get the temperature back!&#xD;
&#xD;
[mcode]temp&#xD;
(*22.312*)&#xD;
[/mcode]&#xD;
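&#xD;
The first line of the w1_slave file ends in YES only when the sensor&amp;#039;s CRC check passed, so a slightly more defensive reader could verify that before parsing (just a sketch--tempChecked and the $Failed fallback are my additions):&#xD;
&#xD;
[mcode]tempChecked := Module[{lines},&#xD;
  lines = StringSplit[Import[&amp;#034;/sys/bus/w1/devices/28-000004fe0343/w1_slave&amp;#034;], &amp;#034;\n&amp;#034;];&#xD;
  If[StringTake[lines[[1]], -3] === &amp;#034;YES&amp;#034;,&#xD;
   N[ToExpression[StringTake[lines[[2]], -5]]/1000],&#xD;
   $Failed]][/mcode]&#xD;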
For kicks, I set up a scheduled task to plot the ambient temperature of my office every 60 seconds for 6 hours. Unsurprisingly, the temperature only fluctuated a few tenths of a degree...!&#xD;
&#xD;
[mcode]t={}&#xD;
RunScheduledTask[(deg=temp;AppendTo[t,deg]),{60,360}];&#xD;
Dynamic[ListLinePlot[t,Joined-&amp;gt;True,PlotRange-&amp;gt;Automatic]]&#xD;
[/mcode]&#xD;
And here&amp;#039;s what the graph looked like after a little bit of time--it truly is a sensitive device (the &amp;#034;large&amp;#034; dip down to 22.1 was me touching the sensor with my cold hands!):&#xD;
&#xD;
[center][img=width: 360px; height: 228px;]/c/portal/getImageAttachment?filename=temperaturereading3.jpg&amp;amp;userId=108162[/img][/center]Any suggestions for what to do next?</description>
    <dc:creator>Allison Taylor</dc:creator>
    <dc:date>2014-02-06T21:04:23Z</dc:date>
  </item>
  <item rdf:about="https://community.wolfram.com/groups/-/m/t/250923">
    <title>Detect radioactivity at home with Mathematica</title>
    <link>https://community.wolfram.com/groups/-/m/t/250923</link>
    <description>I thought I would share the following rather simple experiment. It is based on a Geiger counter which interacts via the serial port with Mathematica. It can be used to measure ambient levels of radioactivity and to test different radioactive sources at home. As an additional benefit it generates &amp;#034;perfect/physical&amp;#034; random numbers (as opposed to pseudorandom numbers), some of which I attach to this post. 

The setup is rather simple, unfortunately one of the parts is a bit expensive. I use

1) [url=https://www.sparkfun.com/products/11345]Sparkfun&amp;#039;s Geiger Counter SEN-11345[/url]. (about 150 $ and similar in £)
2) A USB 2.0 A/Mini-B cable.

[img=width: 400px; height: 300px;]/c/portal/getImageAttachment?filename=photo5.JPG&amp;amp;userId=48754[/img]

The Geiger tube is at the bottom right of the photo. The parts inside the dashed line are under high voltage when the device is switched on; avoid touching any of these parts. The Geiger counter has an onboard ATMEL chip (like an Arduino), so it communicates easily via the serial port. You might have to install the appropriate VCP driver [url=http://www.ftdichip.com/Drivers/VCP.htm]from this website[/url].

Once the driver is installed you connect the Geiger counter to the USB port of the computer. On the software side you need to install the [url=http://library.wolfram.com/infocenter/Demos/5726/]SerialIO package[/url]. Step by step instructions for the installation can be found on [url=http://williamjturkel.net/2011/12/25/connecting-arduino-to-mathematica-on-mac-os-x-with-serialio/]this website[/url].

The Sparkfun Geiger counter sends a sequence of 0s and 1s to the serial port, which indicate whether the previous interval between counts is longer or shorter than the current one. Each number corresponds to one event - the counter measures alpha, beta and gamma radiation; the latter only if the red protective cap is removed. The sequence of zeros and ones can be used as a set of independent random numbers. For most simulations the pseudorandom numbers that computers (and Mathematica) generate are perfectly adequate, but for some other applications, such as encryption, physical random numbers are preferable. The code below will produce that string of random digits, but it will also record the times of the measured decay events. The code is for Mac OS X but can easily be adapted for other operating systems.

[mcode](*Clear everything*)
ClearAll[&amp;#034;Global`*&amp;#034;]

(*Load SerialIO*)
&amp;lt;&amp;lt; SerialIO`

(*Connect to the Geiger counter*)
myGeigerCounter = 
SerialOpen[Quiet[FileNames[&amp;#034;tty.usb*&amp;#034;, {&amp;#034;/dev&amp;#034;}, Infinity]][[1]]];
SerialSetOptions[myGeigerCounter, &amp;#034;BaudRate&amp;#034; -&amp;gt; 9600];
While[SerialReadyQ[myGeigerCounter] == False, Pause[0.1]];

(*Measure*)
s = {}; tZero = AbsoluteTime[]; SerialRead[myGeigerCounter]; 
Dynamic[
Refresh[
AppendTo[s, {SerialRead[myGeigerCounter], AbsoluteTime[] - tZero}];, 
UpdateInterval -&amp;gt; 0.002], 
SynchronousUpdating -&amp;gt; False]
[/mcode]
That is basically the entire code for reading the data. The critical point is the UpdateInterval. If the computer is busy, it might read too slowly and pick up a sequence of 0s and 1s instead of a single digit, so it is important to have few background processes running. An UpdateInterval of 0.002 has proved sufficient for my experiments; Sparkfun&amp;#039;s detector can measure up to 100 Hz. Also, if there is no event for 10 seconds the program will read &amp;#034; &amp;#034; and record the time, i.e. the last event plus 10 seconds. Therefore we need to clean up the data after the measurement is finished (simply abort the evaluation), see below. If you want to plot the measurements dynamically you can use: 

[mcode]Dynamic[sc = DeleteCases[s, {&amp;#034;&amp;#034;, ___}]; 
 ListPlot[sc[[2 ;;, 2]] - sc[[1 ;; -2, 2]],
  PlotRange -&amp;gt; {All, {0, Max[sc[[2 ;;, 2]] - sc[[1 ;; -2, 2]]]}}, 
  ImageSize -&amp;gt; Large]][/mcode]
This gives the following animation, where every couple of seconds a new point is added.

[img=width: 400px; height: 240px;]/c/portal/getImageAttachment?filename=Decay-movie.gif&amp;amp;userId=48754[/img]

If you are interested in an estimate of the &amp;#034;counts per minute&amp;#034; this command will do the trick:

[mcode]Dynamic[10/(sc[[-1, 2]] - sc[[-10, 2]])*60. // N][/mcode]
So by now we have a working Geiger counter and we can visualise the results. 

To test the counter I ran a measurement of the background radiation for a couple of hours. I import the data:

[mcode]tstest = Import[&amp;#034;~/Desktop/Geigerdata_overnight.csv&amp;#034;];
[/mcode]
 Then clean it up

[mcode]tsBackground = DeleteCases[# &amp;amp; /@ tstest, {&amp;#034;&amp;#034;, ___}] // ToExpression[/mcode]
and plot a diagram of the time between two consecutive events

[mcode]ListPlot[tsBackground[[2 ;; -1, 2]] - tsBackground[[1 ;; -2, 2]], 
 PlotRange -&amp;gt; All, AxesLabel -&amp;gt; {&amp;#034;Decay Incidence&amp;#034;, &amp;#034;Waiting Time&amp;#034;}, 
 ImageSize -&amp;gt; Large][/mcode]
[img=width: 576px; height: 320px;]/c/portal/getImageAttachment?filename=RadioactivityAnalysis-BackgroundWT.jpg&amp;amp;userId=48754[/img]

There are slightly more than 12000 events in this time series.
To compare that to some radioactive object, I dismantled a smoke detector and extracted the radioactive Americium-241 container. The source emits alpha and gamma particles and is supposed to have an activity of about 33000 Bq, i.e. decays per second. (It says 0.9 uCi on the detector. Wolfram Alpha knows that that is 33300 Bq.)

[img=width: 400px; height: 300px;]/c/portal/getImageAttachment?filename=photo6.JPG&amp;amp;userId=48754[/img] 

I position the Americium element at about 3 cm from the tip of the Geiger counter.

[img=width: 400px; height: 300px;]/c/portal/getImageAttachment?filename=photo7.JPG&amp;amp;userId=48754[/img]

I then measure the counts over a couple of hours and obtain:

[img=width: 576px; height: 320px;]/c/portal/getImageAttachment?filename=RadioactivityAnalysis-AmericanumWT.jpg&amp;amp;userId=48754[/img]
This time there are more than 20000 events recorded. Note that the waiting times are much shorter than in the background case. We can now compare the histograms of the waiting times between two consecutive events:

[img=width: 576px; height: 322px;]/c/portal/getImageAttachment?filename=RadioactivityAnalysisHistogram.jpg&amp;amp;userId=48754[/img]

The total numbers of events are not the same, but it is obvious that the waiting times are much shorter for the Americium element than for the background (blue).
The waiting time distribution is expected to be exponentially distributed, so we can fit 
[mcode]distBackground = 
 EstimatedDistribution[
  tsBackground[[2 ;; -1, 2]] - tsBackground[[1 ;; -2, 2]], 
  ExponentialDistribution[\[Mu]]][/mcode]which gives
[quote]ExponentialDistribution[0.302391][/quote]and 
[mcode]distAmericanum = 
 EstimatedDistribution[
  tsAmericanum[[2 ;; -1, 2]] - tsAmericanum[[1 ;; -2, 2]], 
  ExponentialDistribution[\[Mu]]][/mcode]which gives
[quote]ExponentialDistribution[1.15882][/quote]
The mean waiting time for the background is
[mcode]Mean[distBackground]

(*3.30698*)[/mcode]and for the Americanum element
[mcode]Mean[distAmericanum]

(*0.862946*)[/mcode]Here is another representation
[mcode]Show[{ListLinePlot[
   Transpose[{HistogramList[
        tsBackground[[2 ;; -1, 2]] - tsBackground[[1 ;; -2, 2]]][[
       1]][[;; -2]], 
     HistogramList[
         tsBackground[[2 ;; -1, 2]] - tsBackground[[1 ;; -2, 2]]][[
        2]] // Log // N}], Mesh -&amp;gt; All, 
   AxesLabel -&amp;gt; {&amp;#034;waiting time&amp;#034;, &amp;#034;Log number&amp;#034;}, ImageSize -&amp;gt; Large], 
  ListLinePlot[
   Transpose[{HistogramList[
        tsAmericanum[[2 ;; -1, 2]] - tsAmericanum[[1 ;; -2, 2]]][[
       1]][[;; -2]], 
     HistogramList[
         tsAmericanum[[2 ;; -1, 2]] - tsAmericanum[[1 ;; -2, 2]]][[
        2]] // Log // N}], PlotStyle -&amp;gt; Red, Mesh -&amp;gt; All]}][/mcode]
[img=width: 576px; height: 335px;]/c/portal/getImageAttachment?filename=RadioactivityAnalysis-Histogram2.jpg&amp;amp;userId=48754[/img]

Wolfram Alpha knows a lot of useful stuff about Americium-241; try:[mcode]== AM 241[/mcode]
One little issue with the measurement is that the Americium element is supposed to have an activity of 33000 Bq, i.e. decays per second. Our measurement, corrected for background radiation, shows more or less one additional decay per second. When the measurement was conducted, the red cap at the tip of the Geiger counter was not removed; this filters out the alpha radiation. I read somewhere that less than 10% of the events come from gamma radiation, even though that seems incorrect given the information on [url=http://en.wikipedia.org/wiki/Americium]wikipedia[/url]. Also, the detector was at a distance of about 3 cm from the element, and its opening is about 1 cm^2. That might account for another factor of 100, but we would still be well below even the 33 or so events per second that remain after these corrections. I have not really looked into this or done any calculations, but it might be nice to find the missing events.
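
A back-of-envelope estimate (my own, with idealised numbers) of the geometric factor alone, treating the detector opening as a 1 cm^2 patch on a sphere of radius 3 cm:

[mcode](* fraction of decays heading towards a 1 cm^2 opening at 3 cm distance *)
fraction = 1./(4 Pi 3.^2)
(* about 0.0088, i.e. roughly 1 decay in 113 *)

33000 fraction
(* about 290 events per second expected from geometry alone,
   ignoring absorption and detector efficiency *)[/mcode]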

This does not seem to be a problem with the 0.002 UpdateInterval, as visual inspection of the time series shows that there are nearly no incidences of two bits being read at the same time from the serial port. The Geiger counter can measure up to 100 Hz. The counts per minute of the background radiation are about 20, which is reasonable. (The measurement was made in Aberdeen (UK), where many houses are built from granite, which is slightly radioactive. Inside the houses radon is produced, which can sometimes yield very high levels of radiation. This particular measurement was not taken in a granite house.) 

If anyone cares about good random numbers, they can use

[mcode]tsAmericanum[[All, 1]][/mcode]
to extract a string of ones and zeros, which are &amp;#034;real&amp;#034; random numbers. I attach a file with the sequence of ones and zeros I got from the Americium experiment. You can also use it to analyse how many times our update interval was too slow, so that two digits were transmitted at once. The file can be read with:
[mcode]data = Import[&amp;#034;~/Desktop/RandomAmericanum.csv&amp;#034;] // Flatten[/mcode]
A simple Tally shows that 
[mcode]Tally[data]

(*{{0, 11779}, {1, 11979}, {11, 65}, {10, 1}}*)[/mcode]Only 66 cases out of about 22000 were double digits.

Much of the code can and should be improved. Using an intermediate Arduino might improve the measurements slightly.

M.

PS: Note that there might be regulations in your country regarding the Americium element of the smoke detector, e.g. how to dispose of it. Also, Sparkfun does not deliver the Geiger counter to all countries. </description>
    <dc:creator>Marco Thiel</dc:creator>
    <dc:date>2014-05-12T21:33:18Z</dc:date>
  </item>
  <item rdf:about="https://community.wolfram.com/groups/-/m/t/23261">
    <title>How do I connect Arduino to Mathematica kernel?</title>
    <link>https://community.wolfram.com/groups/-/m/t/23261</link>
    <description>I&amp;#039;ve recently started having fun with Arduino (see [b][url=http://www.arduino.cc/]www.arduino.com[/url][/b] for details).  The memory and processing speed is far below what is necessary to run the Mathematica kernel, but I wonder if an Arduino program might somehow make calls to a Mathematica kernel (running on a network-connected machine) and get results back.  Has anyone done this?</description>
    <dc:creator>David DeBrota</dc:creator>
    <dc:date>2012-10-19T12:17:10Z</dc:date>
  </item>
  <item rdf:about="https://community.wolfram.com/groups/-/m/t/1026495">
    <title>Coffee &amp;amp; milk with Arduino and Newton&amp;#039;s law of cooling</title>
    <link>https://community.wolfram.com/groups/-/m/t/1026495</link>
    <description>Here is something that I do in my lectures regarding the coffee cooling problem. I use just standard Mathematica and a rather simple approach based on Newton&amp;#039;s law for cooling. See also a related [System Modeler post][1].&#xD;
&#xD;
Newton&amp;#039;s Law of Cooling&#xD;
-----------------------&#xD;
&#xD;
When Newton described the cooling for an object, he used the simple assumption that the change of the temperature of an object is proportional to the temperature difference of the environment and the object.&#xD;
&#xD;
![enter image description here][2]&#xD;
&#xD;
Here Temp(t) is the temperature of the object and TU is the temperature of the environment, e.g. the ambient air. The coefficient $\kappa$ indicates how quickly the temperature adapts to that of the environment.&#xD;
&#xD;
In Mathematica it looks like this:&#xD;
&#xD;
    sols = DSolve[{D[Temp[t], t] == -\[Kappa] (Temp[t] - TU), Temp[0] == T0}, Temp, t]&#xD;
&#xD;
Here we assume that the starting temperature is T0. If we assume the starting temperature to be 80 degrees C and the air temperature to be 20 degrees C, and $\kappa=0.1$ we obtain the following plot.&#xD;
&#xD;
    Plot[Temp[t] /. sols /. {\[Kappa] -&amp;gt; 0.1, TU -&amp;gt; 20, T0 -&amp;gt; 80}, {t, 0, 60}, PlotRange -&amp;gt; {All, {0, 100}}, &#xD;
    AxesLabel -&amp;gt; {&amp;#034;Time&amp;#034;, &amp;#034;Temperature&amp;#034;}, LabelStyle -&amp;gt; Directive[Bold, Medium]]&#xD;
&#xD;
![enter image description here][3]&#xD;
&#xD;
In fact, if we had data we could determine the value for $\kappa$ from experiments. &#xD;
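&#xD;
For example, the fit could look like this (a sketch with synthetic data; fakeData merely stands in for real measurements):&#xD;
&#xD;
    (* synthetic data generated with \[Kappa] = 0.1, TU = 20, T0 = 80 *)&#xD;
    fakeData = Table[{t, 20 + 60 Exp[-0.1 t]}, {t, 0, 60, 5}];&#xD;
    FindFit[fakeData, 20 + 60 Exp[-\[Kappa] t], {\[Kappa]}, t]&#xD;
    (* {\[Kappa] -&amp;gt; 0.1} *)&#xD;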
&#xD;
When do I put the milk in my coffee?&#xD;
------------------------------------&#xD;
&#xD;
&#xD;
So here&amp;#039;s the question. Suppose I am in a hurry in the morning and need to catch a bus. I love my coffee but do not want to burn my lips. When do I pour the cold milk in? As early as I can, or just before I need to leave the house?&#xD;
&#xD;
We can easily model the adding of milk to some coffee by using the WhenEvent function:&#xD;
&#xD;
    sols = NDSolve[{D[Temp[t], t] == -0.1 (Temp[t] - 20), Temp[0] == 80, WhenEvent[t &amp;gt; 1, Temp[t] -&amp;gt; (200. Temp[t] + 50.* 5.)/250.]}, Temp, {t, 0, 30}]&#xD;
&#xD;
It uses a simplified version of the law that governs the mixing of fluids (assuming equal specific heat capacities).&#xD;
&#xD;
![enter image description here][4]&#xD;
&#xD;
where Tmix is the temperature of the resulting coffee-milk mix, mc and mm are the masses of the coffee and the milk, and Tc and Tm are their respective temperatures. Note that the model is not quite correct, in the sense that it does not take into account that the mass of the content of the mug changes after the milk is added. First we put the milk in right at the beginning:&#xD;
&#xD;
    sols = NDSolve[{D[Temp[t], t] == -0.1 (Temp[t] - 20), Temp[0] == 80, &#xD;
       WhenEvent[t &amp;gt; 1, Temp[t] -&amp;gt; (200. Temp[t] + 50.* 5.)/250.]}, &#xD;
      Temp, {t, 0, 30}]&#xD;
&#xD;
Here&amp;#039;s the graph.&#xD;
&#xD;
    Plot[Temp[t] /. sols, {t, 0, 30}, PlotRange -&amp;gt; All, AxesLabel -&amp;gt; {&amp;#034;Time&amp;#034;, &amp;#034;Temperature&amp;#034;}, LabelStyle -&amp;gt; Directive[Bold, Medium]]&#xD;
&#xD;
![enter image description here][5]&#xD;
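&#xD;
With mc = 200 g of coffee and mm = 50 g of milk at 5 degrees C (the numbers used in the WhenEvent above), the mixing rule can be checked numerically; adding the milk to 80 degree coffee gives:&#xD;
&#xD;
    tmix[tc_] := (200. tc + 50.*5.)/250.&#xD;
    tmix[80.]&#xD;
    (* 65 degrees C immediately after the milk is added *)&#xD;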
&#xD;
Next we wait for 10 minutes and then pour the milk in:&#xD;
&#xD;
    sols2 = NDSolve[{D[Temp[t], t] == -0.1 (Temp[t] - 20), Temp[0] == 80, WhenEvent[t &amp;gt; 10, Temp[t] -&amp;gt; (200.* Temp[t] + 50.* 5.)/250.]}, Temp, {t, 0, 30}]&#xD;
&#xD;
Here&amp;#039;s the plot:&#xD;
&#xD;
    Plot[Temp[t] /. sols2, {t, 0, 30}, PlotRange -&amp;gt; All, AxesLabel -&amp;gt; {&amp;#034;Time&amp;#034;, &amp;#034;Temperature&amp;#034;}, &#xD;
     LabelStyle -&amp;gt; Directive[Bold, Medium]]&#xD;
&#xD;
![enter image description here][6]&#xD;
&#xD;
If we plot both together we get this:&#xD;
&#xD;
    Show[Plot[Temp[t] /. sols, {t, 0, 30}, PlotRange -&amp;gt; All], Plot[Temp[t] /. sols2, {t, 0, 30}, PlotRange -&amp;gt; All], &#xD;
    AxesLabel -&amp;gt; {&amp;#034;Time&amp;#034;, &amp;#034;Temperature&amp;#034;}, LabelStyle -&amp;gt; Directive[Bold, Medium]]&#xD;
&#xD;
![enter image description here][7]&#xD;
&#xD;
Clearly, I should put my milk in as late as possible. Well, that&amp;#039;s the model. &#xD;
&#xD;
Getting data for our model&#xD;
--------------------------&#xD;
&#xD;
&#xD;
Let&amp;#039;s get some data with an Arduino and a simple temperature sensor. We use a standard Arduino Uno and wire it like so:&#xD;
&#xD;
![enter image description here][8]&#xD;
&#xD;
I use a standard Arduino and a [Dallas Temperature Sensor][9]. We then need to upload an Arduino C program, called a Sketch, to the microcontroller.&#xD;
&#xD;
    #include &amp;lt;OneWire.h&amp;gt;&#xD;
    #include &amp;lt;DallasTemperature.h&amp;gt;&#xD;
     &#xD;
    // Data wire is plugged into pin 2 on the Arduino&#xD;
    #define ONE_WIRE_BUS 2&#xD;
     &#xD;
    // Setup a oneWire instance to communicate with any OneWire devices &#xD;
    // (not just Maxim/Dallas temperature ICs)&#xD;
    OneWire oneWire(ONE_WIRE_BUS);&#xD;
     &#xD;
    // Pass our oneWire reference to Dallas Temperature.&#xD;
    DallasTemperature sensors(&amp;amp;oneWire);&#xD;
     &#xD;
    void setup(void)&#xD;
    {&#xD;
      // start serial port&#xD;
      Serial.begin(9600);&#xD;
    &#xD;
      // Start up the library&#xD;
      sensors.begin();&#xD;
    }&#xD;
     &#xD;
     &#xD;
    void loop(void)&#xD;
    {&#xD;
     &#xD;
      sensors.requestTemperatures(); // Send the command to get temperatures&#xD;
    &#xD;
      Serial.print(sensors.getTempCByIndex(0),4); &#xD;
      Serial.print(&amp;#034;,&amp;#034;);&#xD;
    }&#xD;
&#xD;
The upload is done via the Arduino IDE. (I could do it directly from within Mathematica.)&#xD;
&#xD;
![enter image description here][10]&#xD;
&#xD;
The Arduino is connected via a serial (i.e. USB) cable to the computer. Mathematica can connect to the serial port like so (on a Mac; it is slightly different on a Windows machine):&#xD;
&#xD;
    myDSTemperature = &#xD;
     DeviceOpen[&amp;#034;Serial&amp;#034;, {Quiet[FileNames[&amp;#034;tty.usb*&amp;#034;, {&amp;#034;/dev&amp;#034;}, Infinity]][[1]], &amp;#034;BaudRate&amp;#034; -&amp;gt; 9600}]&#xD;
&#xD;
We can then run a ScheduledTask every second:&#xD;
&#xD;
    s = {}; tZero = AbsoluteTime[]; DeviceReadBuffer[myDSTemperature];&#xD;
    RunScheduledTask[&#xD;
    AppendTo[s, {AbsoluteTime[] - tZero, ToExpression@(StringSplit[FromCharacterCode@DeviceReadBuffer[myDSTemperature], &amp;#034;,&amp;#034;][[-1]])}], 1]&#xD;
&#xD;
We can then display the result dynamically:&#xD;
&#xD;
    Dynamic[ListLinePlot[s, PlotRange -&amp;gt; {All, {20, 40}}]]&#xD;
&#xD;
After that you might want to stop the Scheduled task and close the connection:&#xD;
&#xD;
    RemoveScheduledTask /@ ScheduledTasks[];&#xD;
    DeviceClose[myDSTemperature]&#xD;
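&#xD;
Incidentally, the list s collected by the scheduled task is just a set of {time, temperature} pairs, so after stopping the task it can be wrapped in a TimeSeries for further processing (a small sketch, assuming s still holds the measurements from above):&#xD;
&#xD;
    ts = TimeSeries[s];&#xD;
    ListLinePlot[ts, AxesLabel -&amp;gt; {&amp;#034;Time&amp;#034;, &amp;#034;Temperature&amp;#034;}]&#xD;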
&#xD;
I have attached the measurements I obtained for my experiment. You can load and display them like this:&#xD;
&#xD;
    ClearAll[&amp;#034;Global`*&amp;#034;]&#xD;
    &#xD;
    data = Import[&amp;#034;~/Desktop/temperaturedata_CoffeMilkGood.csv&amp;#034;];&#xD;
    dataclean = Table[Join[{data[[k]][[1 ;; 6]]}, data[[k]][[7 ;; 9]]], {k, 1, Length[data]}];&#xD;
    data2plotearlymilk = Select[Transpose[{dataclean[[All, 1]], dataclean[[All, 3]]/1000.}], #[[2]] &amp;gt; 0 &amp;amp;];&#xD;
    data2plotlatemilk = Select[Transpose[{dataclean[[All, 1]], dataclean[[All, 2]]/1000.}], #[[2]] &amp;gt; 0 &amp;amp;];&#xD;
    &#xD;
    DateListPlot[{data2plotearlymilk, data2plotlatemilk}, PlotRange -&amp;gt; {All, {0, 100}}, Joined -&amp;gt; True, &#xD;
    FrameLabel -&amp;gt; {&amp;#034;Time&amp;#034;, &amp;#034;Temperature&amp;#034;}, LabelStyle -&amp;gt; Directive[Bold, Medium]]&#xD;
&#xD;
![enter image description here][11]&#xD;
&#xD;
There is clearly one measurement error in the blue curve, but the data does appear to show that I have a better chance of not burning my lips if I put the milk in as late as I can.&#xD;
&#xD;
Note that I have assumed that the air temperature does not change. In fact, I also measured the air temperature during the experiment - it was constant, as expected. &#xD;
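&#xD;
For instance, Newton&amp;#039;s law of cooling can be fitted directly to one of the measured curves with FindFit. The sketch below is hypothetical in its variable names: it assumes earlymilk holds {time, temperature} pairs extracted from the data above, with T0 the initial temperature, Tair the air temperature and k the cooling constant:&#xD;
&#xD;
    model = Tair + (T0 - Tair) Exp[-k t];&#xD;
    fit = FindFit[earlymilk, model, {{T0, 90}, {Tair, 20}, {k, 0.1}}, t]&#xD;
&#xD;
Comparing the fitted k between the early-milk and late-milk runs then quantifies how adding the milk changes the cooling rate.&#xD;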
&#xD;
As I said, I know that this is not really addressing the original question, but when I teach mathematical modelling, I always like showing the students that it is quite relevant to compare the models to data.&#xD;
&#xD;
Cheers,&#xD;
&#xD;
Marco&#xD;
&#xD;
&#xD;
  [1]: http://community.wolfram.com/groups/-/m/t/1021383&#xD;
  [2]: http://community.wolfram.com//c/portal/getImageAttachment?filename=ScreenShot2017-02-28at15.21.22.png&amp;amp;userId=48754&#xD;
  [3]: http://community.wolfram.com//c/portal/getImageAttachment?filename=ScreenShot2017-02-28at15.29.35.png&amp;amp;userId=48754&#xD;
  [4]: http://community.wolfram.com//c/portal/getImageAttachment?filename=ScreenShot2017-02-28at15.38.21.png&amp;amp;userId=48754&#xD;
  [5]: http://community.wolfram.com//c/portal/getImageAttachment?filename=ScreenShot2017-02-28at15.41.25.png&amp;amp;userId=48754&#xD;
  [6]: http://community.wolfram.com//c/portal/getImageAttachment?filename=ScreenShot2017-02-28at15.42.59.png&amp;amp;userId=48754&#xD;
  [7]: http://community.wolfram.com//c/portal/getImageAttachment?filename=ScreenShot2017-02-28at15.44.18.png&amp;amp;userId=48754&#xD;
  [8]: http://community.wolfram.com//c/portal/getImageAttachment?filename=ScreenShot2017-02-28at15.31.12.png&amp;amp;userId=48754&#xD;
  [9]: https://www.amazon.co.uk/gp/product/B00CHEZ250/&#xD;
  [10]: http://community.wolfram.com//c/portal/getImageAttachment?filename=ScreenShot2017-02-28at15.34.45.png&amp;amp;userId=48754&#xD;
  [11]: http://community.wolfram.com//c/portal/getImageAttachment?filename=ScreenShot2017-02-28at15.51.11.png&amp;amp;userId=48754</description>
    <dc:creator>Marco Thiel</dc:creator>
    <dc:date>2017-03-07T00:34:37Z</dc:date>
  </item>
  <item rdf:about="https://community.wolfram.com/groups/-/m/t/615905">
    <title>Connecting ROS (and a Parrot ArDrone 2.0) to the Wolfram Language</title>
    <link>https://community.wolfram.com/groups/-/m/t/615905</link>
    <description>**Connecting ROS to the Wolfram Language**&#xD;
======================================&#xD;
*Or Controlling a Parrot ArDrone 2.0 from Mathematica*&#xD;
----------------------------------------------------&#xD;
&#xD;
![enter image description here][1]&#xD;
&#xD;
----------&#xD;
Foreword&#xD;
--------&#xD;
During this summer, between the end of my high school education and the beginning of my undergraduate studies, I had the opportunity to participate in the Wolfram Mentorship program where I worked on a project called &amp;#034;Connecting ROS to the Wolfram Language&amp;#034;. &#xD;
&#xD;
[ROS][2], which is an abbreviation for &amp;#034;Robot Operating System&amp;#034;, is software available for Linux that acts as a &amp;#039;conductor&amp;#039; between the different electronic components of one or more robots (such as the engines and the cameras), so that the components and the controller (a computer, for example) can communicate with each other. A good introduction to ROS can be found [here][3] and in-depth tutorials are available [there][4].&#xD;
&#xD;
The aim of this project was to implement a connection to ROS from the Wolfram Language, i.e. to be able to collect and interpret data from the robots controlled by ROS in Mathematica, as well as to control ROS and the robots from Mathematica, and to test the results using a Parrot ArDrone 2.0. &#xD;
&#xD;
This post, which marks the end of the project, describes the different steps I went through to complete it. It assumes prior knowledge of ROS (and the [ardrone autonomy package][5]) as well as basic Linux and Mathematica. It is divided into two parts:&#xD;
&#xD;
 1. &amp;#039;collecting and interpreting data&amp;#039; in Mathematica,&#xD;
 2. &amp;#039;controlling ROS and supported robots&amp;#039; from Mathematica.&#xD;
&#xD;
These parts are independent: throughout the first part, the code developed is illustrated using turtlesim (a turtle simulator used in the ROS tutorials), and throughout the second part the examples refer to the Parrot ArDrone 2.0.&#xD;
&#xD;
----------&#xD;
&#xD;
&#xD;
PART 1: COLLECTING &amp;amp; INTERPRETING DATA&#xD;
--------------------------------------&#xD;
Extracting and Understanding Data from ROS&#xD;
------------------------------------------&#xD;
After installing ROS and getting familiar with the environment, the first step is to figure out how to extract data from it, or to be more precise, how to log all the messages published over one or more topics to a file which can be retrieved by the user. &#xD;
&#xD;
Although there is a tool called rosbag which records the selected ROS topics and stores them in a .bag file, it turns out that I only managed to open these files in ROS, so a different procedure was used. It consists of echoing the messages published on selected topics (using the command rostopic echo) to a file of the chosen format (since messages are just simple text, the format used will be .txt). This can simply be done by executing the following command in the Linux terminal before sending messages over the topic(s) (once ROS is installed and started):&#xD;
&#xD;
    rostopic echo [topic] &amp;gt; [file].txt &#xD;
&#xD;
For example, if we were going to extract the velocity commands sent to the turtle to a file called rosdata.txt, the command would be: &#xD;
&#xD;
    rostopic echo /turtle/cmd_vel &amp;gt; rosdata.txt &#xD;
&#xD;
And opening the text file (once the ROS session is ended) would return a sequence of messages looking like the one below (the values of x,y,z might change):&#xD;
&#xD;
    linear:&#xD;
      x: 1.0&#xD;
      y: 0.0&#xD;
      z: 0.0&#xD;
    angular:&#xD;
      x: 0.0&#xD;
      y: 0.0&#xD;
      z: 1.0&#xD;
    ---&#xD;
&#xD;
Now, in order to &amp;#039;use&amp;#039; the data, it is necessary to understand the structure of the text file. It can be represented by the &amp;#039;formula&amp;#039; below:&#xD;
&#xD;
    Header_1                    (some indication about the variables displayed beneath)&#xD;
        Variable_1: value 1     (a variable name followed by its value)&#xD;
        ....&#xD;
        Variable_n: value n     (a variable name followed by its value)&#xD;
    ....&#xD;
    Header_n                    (some indication about the variables displayed beneath)&#xD;
        Variable_1: value 1     (a variable name followed by its value)&#xD;
        ....&#xD;
        Variable_n: value n     (a variable name followed by its value)&#xD;
    ---                         (end of message)&#xD;
&#xD;
So, going back to the turtlesim example:&#xD;
&#xD;
 - linear x represents the linear velocity of the turtle (in m/s),&#xD;
 - angular z represents the angular velocity of the turtle in rad/s (its bearing being the sum of the angular displacements). &#xD;
 - The other variables are always 0 as the turtle moves in a 2D plane.&#xD;
&#xD;
Note: the meaning of a variable in a message might not always be obvious at first sight (due to ambiguous variable names such as x, y, z); however, by observing the values of the variables against the actions of the robot (the trajectory it takes in our case), the meaning can be retrieved quite easily. &#xD;
&#xD;
&#xD;
Importing the Data into Mathematica&#xD;
-----------------------------------&#xD;
&#xD;
Now that it is possible to retrieve and understand messages published over a ROS topic, the next step is to import those messages into Mathematica. In order to do this, a function is created which takes the path of the text file containing the messages as an argument and returns the messages as a list, as displayed below:  &#xD;
&#xD;
    MessageImport[filepath_]:=&#xD;
    Module[&#xD;
    {stream=OpenRead[filepath],character,rules= {}},&#xD;
    character=Read[stream,String];&#xD;
    Reap[&#xD;
    While[character=!=EndOfFile,&#xD;
    If[character=== &amp;#034;---&amp;#034;,&#xD;
    Sow[Flatten[rules]];rules= {},&#xD;
    AppendTo[rules,&#xD;
    StringCases[character,&#xD;
    {var:(WordCharacter..)~~&amp;#034;:&amp;#034;~~Whitespace~~value:NumberString:&amp;gt;(var-&amp;gt;ToExpression[value]),&#xD;
    var:(WordCharacter..)~~&amp;#034;:&amp;#034;~~Whitespace:&amp;gt;(&amp;#034;Head&amp;#034;-&amp;gt;var)}]]];&#xD;
    character=Read[stream,String]];&#xD;
    Close[stream];][[2,1]]];&#xD;
&#xD;
It is then possible to represent this in a Dataset like so: &#xD;
&#xD;
    Dataset[&#xD;
    Association[#]&amp;amp;/@&#xD;
    Partition[&#xD;
    Flatten[&#xD;
    MessageImport[filepath]],4]]&#xD;
&#xD;
Hence, returning to the turtlesim example, if rosdata.txt is the file where the velocity command messages are published, the code below:&#xD;
&#xD;
    Dataset[&#xD;
     Association[#] &amp;amp; /@&#xD;
      Partition[&#xD;
       Flatten[&#xD;
        MessageImport[&amp;#034;rosdata.txt&amp;#034;]], 4]]&#xD;
&#xD;
would return something of the form:&#xD;
&#xD;
  ![enter image description here][6]&#xD;
&#xD;
&#xD;
Interpreting the Data&#xD;
---------------------&#xD;
&#xD;
Having imported the data into Mathematica, it can now be interpreted using the Wolfram Language. For example, in the case of a topic where the published messages contain linear and angular velocities, like the one described above, the path of the robot can be retrieved with the following code (using the function AnglePath):  &#xD;
&#xD;
    Graphics[Line[AnglePath[Values[MessageImport[filepath][[All,{2,8}]]]]]]&#xD;
&#xD;
Therefore, considering the turtlesim example, if the path of the turtle was:&#xD;
&#xD;
![enter image description here][7]&#xD;
&#xD;
The following code:&#xD;
&#xD;
    Graphics[Line[AnglePath[Values[MessageImport[&amp;#034;rosdata.txt&amp;#034;][[All,{2,8}]]]]]]&#xD;
&#xD;
would return:&#xD;
&#xD;
![enter image description here][8]&#xD;
&#xD;
which is the same as the actual path taken by the turtle.&#xD;
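&#xD;
As a quick illustration of what AnglePath does with such {distance, angle} pairs, here is a small synthetic example (the numbers are made up, not taken from the recorded data):&#xD;
&#xD;
    Graphics[Line[AnglePath[{{1., 0.}, {1., 0.5}, {1., 0.5}, {1., 0.5}}]]]&#xD;
&#xD;
Each pair turns the path by the given angle (in radians) and advances it by the given distance, which matches the {linear x, angular z} columns extracted from the messages.&#xD;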
&#xD;
----------&#xD;
&#xD;
PART 2: CONTROLLING ROS AND SUPPORTED ROBOTS&#xD;
-------------------------------------------&#xD;
Sending Commands to ROS from Mathematica&#xD;
----------------------------------------&#xD;
&#xD;
Yet another thing to consider is how to send ROS commands from Mathematica in order to be able to directly operate ROS and record ROS data from a Mathematica notebook. &#xD;
&#xD;
One possible way to do so would be to use the Run command; however, it is necessary to source the ROS environment before executing ROS commands, and I did not manage to do so with Run. &#xD;
&#xD;
So, another method to execute ROS commands from Mathematica is to create an empty shell script and make it executable (using the chmod command in the Linux terminal); one can then write the desired commands into the script and execute it from Mathematica, thanks to the function below (which takes the desired command as an argument):&#xD;
&#xD;
    RosCommand[Command_]:=&#xD;
    WaitNext[&#xD;
    ParallelSubmit/@&#xD;
    Unevaluated[{&#xD;
    {WriteString[&#xD;
    OpenWrite[&amp;#034;script&amp;#034;],&#xD;
    &amp;#034;bash -c \&amp;#034;source /opt/ros/indigo/setup.bash &amp;amp;&amp;amp; source ~/catkin_ws/devel/setup.bash &amp;amp;&amp;amp; &amp;#034;,Command,&amp;#034;\&amp;#034;&amp;#034;],Close[&amp;#034;script&amp;#034;],&#xD;
    SetDirectory[&amp;#034;/&amp;#034;],&#xD;
    SetEnvironment[&amp;#034;LD_LIBRARY_PATH&amp;#034;-&amp;gt;&amp;#034;&amp;#034;],&#xD;
    RunProcess[&amp;#034;script&amp;#034;,&amp;#034;StandardOutput&amp;#034;],&#xD;
    SetEnvironment[&amp;#034;LD_LIBRARY_PATH&amp;#034;-&amp;gt;FileNameJoin[{$InstallationDirectory,&amp;#034;SystemFiles&amp;#034;,&amp;#034;Libraries&amp;#034;,$SystemID}]],&#xD;
    SetDirectory[&amp;#034;your working directory&amp;#034;]},&#xD;
    Pause[5]}]];&#xD;
&#xD;
Note: a second kernel is used in that code in order to kill the process running in the first one, as some commands executed by the RosCommand function would otherwise evaluate indefinitely.&#xD;
&#xD;
Hence, to start the ROS master, the command would be:&#xD;
&#xD;
    RosCommand[&amp;#034;roscore&amp;#034;]&#xD;
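&#xD;
Other ROS commands can be executed the same way, passed as strings; for instance, to start the driver node from the ardrone autonomy package (assuming the package is installed and built in the sourced workspace):&#xD;
&#xD;
    RosCommand[&amp;#034;rosrun ardrone_autonomy ardrone_driver&amp;#034;]&#xD;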
&#xD;
&#xD;
Publishing messages from Mathematica&#xD;
------------------------------------&#xD;
&#xD;
Finally, another thing to implement in order to be able to perform most of ROS&amp;#039;s basic features from Mathematica is the ability to publish messages on topics. In some cases this can be done by using the ROS command rostopic pub through the RosCommand function described in the previous section. &#xD;
&#xD;
This method works fine; however, there is quite a considerable latency (about 1-2 seconds) between the moment the command is evaluated and the moment the message is actually published. Hence, this approach only works for topics where messages have to be published at a rate lower than 0.5 Hz, for example the takeoff and land commands for the drone, as shown below:&#xD;
&#xD;
    Takeoff:&#xD;
    RosCommand[&amp;#034;rostopic pub ardrone/takeoff std_msgs/Empty&amp;#034;]&#xD;
    &#xD;
    Landing:&#xD;
    RosCommand[&amp;#034;rostopic pub ardrone/land std_msgs/Empty&amp;#034;]&#xD;
&#xD;
For topics where messages have to be published at a rate greater than 0.5 Hz, a publisher ROS node (a &amp;#039;Mathematica&amp;#039; node) taking its arguments from Mathematica should be created. This node can be written in C++, and the arguments from Mathematica can be transferred by simple file I/O; i.e. this process would have the following structure:&#xD;
&#xD;
 - Mathematica Notebook:&#xD;
     - Some functions would assign a value to the desired message variables (controller part)&#xD;
     - Some code would save the variable values to a text file at a chosen rate&#xD;
 - C++ code:&#xD;
     - Some code declaring the Publisher Node&#xD;
     - Some code importing the variable values from the text file&#xD;
     - Some code publishing these values at the chosen rate&#xD;
&#xD;
Note: the instructions to build the node are available [here][9].&#xD;
&#xD;
Hence, to send velocity commands to the ArDrone from Mathematica, the Mathematica Notebook could be :&#xD;
&#xD;
    CONTROL PART:&#xD;
    &#xD;
    Turn right:&#xD;
    {p = {1, 0, 0}; Pause[1]; p = {0, 0, 0}}&#xD;
    &#xD;
    Move forwards:&#xD;
    {p = {0, 0.2, 0}, Pause[1], p = {0, 0, 0}}&#xD;
    &#xD;
    Move Up:&#xD;
    {p = {0, 0, 0.1}, Pause[1], p = {0, 0, 0}}&#xD;
    &#xD;
    Move Down:&#xD;
    {p = {0, 0, -0.1}, Pause[1], p = {0, 0, 0}}&#xD;
    &#xD;
    &#xD;
    VARIABLE OUTPUT PART: &#xD;
    &#xD;
    Dynamic[&#xD;
    Refresh[&#xD;
    StringJoin[&amp;#034;b  &amp;#034;,&#xD;
    Flatten[{&#xD;
    If[Round[p[[1]],0.1]&amp;gt;=0,&#xD;
    PadRight[Characters[ToString[Round[p[[1]],0.1]]],3,&amp;#034;0&amp;#034;],&#xD;
    PadRight[Characters[ToString[Round[p[[1]],0.1]]],4,&amp;#034;0&amp;#034;]],&#xD;
    &amp;#034;  &amp;#034;,&#xD;
    If[Round[p[[2]],0.1]&amp;gt;=0,&#xD;
    PadRight[Characters[ToString[Round[p[[2]],0.1]]],3,&amp;#034;0&amp;#034;],&#xD;
    PadRight[Characters[ToString[Round[p[[2]],0.1]]],4,&amp;#034;0&amp;#034;]],&amp;#034;  &amp;#034;,&#xD;
    If[Round[p[[3]],0.1]&amp;gt;=0,&#xD;
    PadRight[Characters[ToString[Round[p[[3]],0.1]]],3,&amp;#034;0&amp;#034;],&#xD;
    PadRight[Characters[ToString[Round[p[[3]],0.1]]],4,&amp;#034;0&amp;#034;]]&#xD;
    }],&amp;#034;  e&amp;#034;]&amp;gt;&amp;gt;&amp;#034;file.txt&amp;#034;&#xD;
    ,UpdateInterval-&amp;gt;1]]&#xD;
&#xD;
where p = {angular_z, linear_x, linear_z} and:&#xD;
&#xD;
 - angular_z represents the angular velocity of the drone in the xy plane (in rad/s) (its bearing being the sum of the angular displacements),&#xD;
 - linear_x represents the linear velocity of the drone in the xy plane (in m/s),&#xD;
 - linear_z represents the velocity of the drone along the z axis (in m/s).&#xD;
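&#xD;
As an aside, since the stream extraction on the C++ side (infile &amp;gt;&amp;gt; b &amp;gt;&amp;gt; nb ...) parses free-format numbers, the fixed-width padding above is not strictly required; the variable output part could be sketched more compactly like this (assuming the same three-component p as above):&#xD;
&#xD;
    Dynamic[Refresh[&#xD;
      StringRiffle[Join[{&amp;#034;b&amp;#034;}, ToString /@ Round[p, 0.1], {&amp;#034;e&amp;#034;}], &amp;#034;  &amp;#034;] &amp;gt;&amp;gt; &amp;#034;file.txt&amp;#034;,&#xD;
      UpdateInterval -&amp;gt; 1]]&#xD;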
&#xD;
Moreover, the C++ code could look like:&#xD;
&#xD;
    #include &amp;#034;ros/ros.h&amp;#034;&#xD;
    #include &amp;#034;geometry_msgs/Twist.h&amp;#034;&#xD;
    &#xD;
    #include &amp;lt;boost/lexical_cast.hpp&amp;gt;&#xD;
    #include &amp;lt;iostream&amp;gt;&#xD;
    #include &amp;lt;string&amp;gt;&#xD;
    #include &amp;lt;fstream&amp;gt;&#xD;
    #include &amp;lt;stdlib.h&amp;gt;&#xD;
    &#xD;
    using namespace std;&#xD;
    &#xD;
    int main(int argc, char **argv)&#xD;
    {&#xD;
    	ros::init(argc, argv, &amp;#034;mathematica&amp;#034;);&#xD;
    	ros::NodeHandle n;&#xD;
    	&#xD;
    	ros::Publisher chatter_pub = n.advertise&amp;lt;geometry_msgs::Twist&amp;gt;(&amp;#034;/cmd_vel&amp;#034;, 1000);&#xD;
    &#xD;
    	ros::Rate loop_rate(1.0);&#xD;
    &#xD;
    	int count = 0;&#xD;
    &#xD;
    	while (ros::ok())&#xD;
    	{&#xD;
    		//start&#xD;
    		ifstream infile( &amp;#034;file.txt&amp;#034; );&#xD;
    		double nb; &#xD;
    		double nb2;&#xD;
    		double nb3;&#xD;
    		string b;&#xD;
    		string e;&#xD;
    		(infile &amp;gt;&amp;gt; b &amp;gt;&amp;gt; nb &amp;gt;&amp;gt; nb2 &amp;gt;&amp;gt; nb3 &amp;gt;&amp;gt; e);&#xD;
    		//end&#xD;
    		&#xD;
    		geometry_msgs::Twist vel_msg;&#xD;
    &#xD;
     		vel_msg.linear.x = nb2;&#xD;
    		vel_msg.linear.y =0.0;&#xD;
       		vel_msg.linear.z = nb3;&#xD;
    &#xD;
       		vel_msg.angular.x =0.0;&#xD;
       		vel_msg.angular.y =0.0;&#xD;
       		vel_msg.angular.z = nb;&#xD;
    &#xD;
    	    	ROS_INFO(&amp;#034;[Random Walk] linear.x = %.2f, angular.z=%.2f\n&amp;#034;, vel_msg.linear.x, vel_msg.angular.z);&#xD;
    &#xD;
    	    	chatter_pub.publish(vel_msg);&#xD;
    &#xD;
        		ros::spinOnce();&#xD;
    &#xD;
        		loop_rate.sleep();&#xD;
        		++count;&#xD;
    	}&#xD;
    &#xD;
    	return 0;&#xD;
    }&#xD;
&#xD;
----------&#xD;
&#xD;
Conclusion&#xD;
----------&#xD;
Throughout the Mentorship program, I developed functions enabling me to fully interact with ROS and the robot(s) it is controlling from a Mathematica notebook. Owning a Parrot ArDrone myself, which is a device supported by ROS, I tested this in real conditions and the result can be seen below:&#xD;
&#xD;
![enter image description here][10]&#xD;
&#xD;
This is only a GIF; the full video is available [here][11]. &#xD;
&#xD;
All in all, I have thoroughly enjoyed working on this project and would love to hear your feedback. I see it as a starting point towards using the many abilities of the Wolfram Language to improve the capabilities of ROS-supported robots: I can already imagine myself using Mathematica to create an autopilot for my drone!&#xD;
&#xD;
Finally, I would also like to thank Todd Rowland, Alison Kimball and Wolfram Research for the help and support provided throughout the mentorship program.&#xD;
&#xD;
Loris Gliner &#xD;
&#xD;
&#xD;
  [1]: http://community.wolfram.com//c/portal/getImageAttachment?filename=fig0.png&amp;amp;userId=553216&#xD;
  [2]: http://www.ros.org/&#xD;
  [3]: http://robohub.org/ros-101-intro-to-the-robot-operating-system/&#xD;
  [4]: http://wiki.ros.org/&#xD;
  [5]: http://wiki.ros.org/ardrone_autonomy&#xD;
  [6]: http://community.wolfram.com//c/portal/getImageAttachment?filename=fig1.PNG&amp;amp;userId=553216&#xD;
  [7]: http://community.wolfram.com//c/portal/getImageAttachment?filename=fig2.JPG&amp;amp;userId=553216&#xD;
  [8]: http://community.wolfram.com//c/portal/getImageAttachment?filename=fig3.png&amp;amp;userId=553216&#xD;
  [9]: http://wiki.ros.org/ROS/Tutorials/WritingPublisherSubscriber%28c++%29&#xD;
  [10]: http://community.wolfram.com//c/portal/getImageAttachment?filename=ardronegif.gif&amp;amp;userId=553216&#xD;
  [11]: https://www.youtube.com/watch?v=zD8yjOCWhJQ</description>
    <dc:creator>Loris Gliner</dc:creator>
    <dc:date>2015-11-17T17:57:08Z</dc:date>
  </item>
  <item rdf:about="https://community.wolfram.com/groups/-/m/t/1057588">
    <title>Parallel Mathematica Environment on the RaspberryPi using OOP</title>
    <link>https://community.wolfram.com/groups/-/m/t/1057588</link>
    <description>My project, Parallel Mathematica Environment on the RaspberryPi using OOP, is a sample application of **Object Oriented Programming for the Mathematica** cluster computing, implemented with a Mac and three RaspberryPi Zero connected with a USB hub and three USB cables.&#xD;
&#xD;
The basic idea is to deploy a constructed instance image to the calculating servers (the RaspberryPis) and send messages to the instances. [OOP for Mathematica has already been developed and shown][1] in this community, and further detail is given in the [SlideShare][2] presentation titled &amp;#034;OOP for Mathematica.&amp;#034;&#xD;
![enter image description here][3]&#xD;
&#xD;
&#xD;
----------&#xD;
&#xD;
&#xD;
&#xD;
The RaspberryPi Zeros are prepared as follows, using an SSH connection from the Mac: &#xD;
&#xD;
 - name each Zero as raspberrypi, raspberrypi1, raspberrypi2, ...&#xD;
 - set up the server program &amp;#034;init&amp;#034; on each RaspberryPi; init is:&#xD;
&#xD;
        $ cat init&#xD;
        While[True,&#xD;
        Run[nc -l 8000&amp;gt;input];&#xD;
        temp=ReleaseHold[&amp;lt;&amp;lt;input];&#xD;
        temp &amp;gt;&amp;gt;output;&#xD;
        Run[nc your-mac-hostname.local 8002&amp;lt;output]&#xD;
        ]&#xD;
        &#xD;
&#xD;
where, socket numbers must be identical.&#xD;
&#xD;
 - run Mathematica manually, and wait for Mathematica to boot up:&#xD;
&#xD;
        $ wolfram &amp;lt;init&amp;amp;&#xD;
&#xD;
It is useful to check each RaspberryPi like so:&#xD;
&#xD;
    $ nc -l 8002 &amp;gt;output|nc raspberrypi.local 8000 &amp;lt;&amp;lt;EOF&#xD;
    &amp;gt; 10!&#xD;
    &amp;gt; EOF&#xD;
    $ cat output&#xD;
    3628800&#xD;
&#xD;
&#xD;
----------&#xD;
&#xD;
Cluster controller program on a Mac is,&#xD;
&#xD;
 - set directory&#xD;
&#xD;
        SetDirectory[NotebookDirectory[]];&#xD;
&#xD;
 - setup socket communication process&#xD;
&#xD;
        com1=&amp;#034;nc -l 8002 &amp;gt;output1 |nc raspberrypi.local 8000 &amp;lt;input1&amp;#034;;&#xD;
        com2=&amp;#034;nc -l 9002 &amp;gt;output2 |nc raspberrypi1.local 9000 &amp;lt;input2&amp;#034;;&#xD;
        com3=&amp;#034;nc -l 9502 &amp;gt;output3 |nc raspberrypi2.local 9500 &amp;lt;input3&amp;#034;;&#xD;
&#xD;
 - set object property&#xD;
&#xD;
        obj={&#xD;
           &amp;lt;|&amp;#034;name&amp;#034;-&amp;gt;node1,&amp;#034;comm&amp;#034;-&amp;gt;com1,&amp;#034;in&amp;#034;-&amp;gt;&amp;#034;input1&amp;#034;,&amp;#034;out&amp;#034;-&amp;gt;&amp;#034;output1&amp;#034;,&amp;#034;p&amp;#034;-&amp;gt;{2000,3500}|&amp;gt;,&#xD;
           &amp;lt;|&amp;#034;name&amp;#034;-&amp;gt;node2,&amp;#034;comm&amp;#034;-&amp;gt;com2,&amp;#034;in&amp;#034;-&amp;gt;&amp;#034;input2&amp;#034;,&amp;#034;out&amp;#034;-&amp;gt;&amp;#034;output2&amp;#034;,&amp;#034;p&amp;#034;-&amp;gt;{3501,4000}|&amp;gt;,&#xD;
           &amp;lt;|&amp;#034;name&amp;#034;-&amp;gt;node3,&amp;#034;comm&amp;#034;-&amp;gt;com3,&amp;#034;in&amp;#034;-&amp;gt;&amp;#034;input3&amp;#034;,&amp;#034;out&amp;#034;-&amp;gt;&amp;#034;output3&amp;#034;,&amp;#034;p&amp;#034;-&amp;gt;{4000,4500}|&amp;gt;};&#xD;
&#xD;
 - define the calculation server class; here, a sample Mersenne prime number calculation:&#xD;
&#xD;
        new[nam_]:=Module[{ps,pe},&#xD;
           mersenneQ[n_]:=PrimeQ[2^n-1];&#xD;
           setv[nam[{s_,e_}]]^:={ps,pe}={s,e};&#xD;
           calc[nam]^:=Select[Range[ps,pe],mersenneQ]&#xD;
           ];&#xD;
&#xD;
 - construct instances&#xD;
&#xD;
        Map[new[#name]&amp;amp;,obj];&#xD;
&#xD;
 - deploy instances to calculation servers&#xD;
&#xD;
        Map[Save[#in,#name]&amp;amp;,obj];&#xD;
        Map[Run[#comm]&amp;amp;,obj];&#xD;
&#xD;
 - send message to each instance&#xD;
&#xD;
        Map[Put[Hold@setv[#name[#p]],#in]&amp;amp;,obj];&#xD;
        Map[Run[#comm]&amp;amp;,obj];&#xD;
&#xD;
 - start calculation&#xD;
&#xD;
        Map[Put[Hold@calc[#name],#in]&amp;amp;,obj];&#xD;
        proc=Map[StartProcess[{$SystemShell,&amp;#034;-c&amp;#034;,#comm}]&amp;amp;,obj]&#xD;
&#xD;
 - wait for the process termination (manually in this sample code)&#xD;
&#xD;
        Map[ProcessStatus[#]&amp;amp;,proc]&#xD;
         {Finished,Finished,Finished}&#xD;
&#xD;
 - gather the results&#xD;
&#xD;
        Map[FilePrint[#out]&amp;amp;,obj];&#xD;
         {2203, 2281, 3217}&#xD;
        {}&#xD;
        {4253, 4423}&#xD;
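&#xD;
To read the results back into the session as Wolfram Language expressions rather than just printing the files, one could use Get on each output file (a sketch reusing the obj property list from above):&#xD;
&#xD;
        results = Map[Get[#out] &amp;amp;, obj]&#xD;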
&#xD;
&#xD;
  [1]: http://community.wolfram.com/groups/-/m/t/897081?p_p_auth=o5qxZhNR&#xD;
  [2]: https://www.slideshare.net/kobayashikorio/oop-for-mathematica&#xD;
  [3]: http://community.wolfram.com//c/portal/getImageAttachment?filename=2017-04-10.jpg&amp;amp;userId=897049</description>
    <dc:creator>Hirokazu Kobayashi</dc:creator>
    <dc:date>2017-04-10T01:15:22Z</dc:date>
  </item>
  <item rdf:about="https://community.wolfram.com/groups/-/m/t/454226">
    <title>Build your own weather station in a snap with the Wolfram Cloud</title>
    <link>https://community.wolfram.com/groups/-/m/t/454226</link>
    <description>Recently Stephen Wolfram announced the [Wolfram Data Drop][1], which is a great new tool to upload any type of data from any type of device. In this post, I will show how you can use the Data Drop when building your own weather station using some basic hardware and a few lines of code. Once completed, your device will take temperature measurements every second for 60 seconds, and upload their average value to the Wolfram DataDrop every minute. This will give you 60 data points per hour and 1,440 data points per day. With this data you can then use the Wolfram Programming Cloud to understand how the temperature changes over time: You can find out the exact times in a given day when the temperature was the highest or lowest, and when the temperature changed the fastest, and maybe even use the data to make predictions for the future! Can you beat your local weather station and make a prediction that is better?&#xD;
&#xD;
![enter image description here][2]&#xD;
&#xD;
How to build your weather station&#xD;
---------------------------------&#xD;
For this experiment you will need an [Arduino Yun][3] (or an equivalent Arduino that has wireless capability), a [TMP36 temperature sensor][4], a breadboard, and jumper wires.&#xD;
&#xD;
Here is the hardware diagram. Connect the 5V pin to the left pin of the TMP36 sensor, connect the GND pin to the right pin of the TMP36, and the A0 pin to the middle TMP36 pin. &#xD;
&#xD;
![enter image description here][5]&#xD;
&#xD;
Once everything is connected and powered up, the TMP36 sensor will send a voltage to the A0 pin. This voltage increases when the temperature goes up and it decreases when the temperature goes down. So we can use the voltage reading and interpret it as a temperature. Luckily in this experiment we only needed three jumper cables, so hopefully you did not end up looking like this poor man:&#xD;
&#xD;
![enter image description here][6]&#xD;
&#xD;
Programming the Arduino&#xD;
-------------------------&#xD;
Now we are ready to write the Arduino code which is going to upload the recorded temperature data to the Wolfram Cloud. Make sure your Arduino Yun is configured to connect to the internet (http://arduino.cc/en/Guide/ArduinoYun#toc13). Then using the Arduino application, upload the following sketch onto your Arduino after you replace the text &amp;#039;YOUR_BIN_ID&amp;#039; with the &amp;#039;Short ID&amp;#039; of a Databin that you created yourself:&#xD;
&#xD;
    CreateDatabin[]&#xD;
&#xD;
![enter image description here][7]&#xD;
&#xD;
To follow along with the code, here is what it does: The process &amp;#039;p&amp;#039; variable is used for calling a tool called &amp;#039;curl&amp;#039; which is a way to make http requests with your Arduino. In our case we call a specific &amp;#039;data drop&amp;#039; url, which lets you upload small bits of temperature data easily (https://datadrop.wolframcloud.com/api/v1.0/Add). In the loop() section of the code you can see how the variable &amp;#039;val&amp;#039; is reading from the analog pin (A0) and how it is then converted from a raw reading to a &amp;#039;temperature&amp;#039; variable. This temperature is then added to an &amp;#039;average&amp;#039; variable exactly 60 times, but on the 60th time, the code will execute the block of the if statement. This code block runs the data upload code to upload the average of the 60 measurements and it also resets all the counters, so that everything will start over again. Finally, at the end, is a 1000 millisecond delay which will (approximately) space out your recorded temperatures by one second.&#xD;
&#xD;
&amp;lt;pre&amp;gt;&#xD;
#include &amp;amp;lt;Bridge.h&amp;amp;gt;&#xD;
&#xD;
Process p;&#xD;
int val,count;&#xD;
float voltage,temperature,average;&#xD;
&#xD;
void setup() {&#xD;
 count=0;&#xD;
 average=0;&#xD;
 Bridge.begin();&#xD;
 Serial.begin(9600);&#xD;
}&#xD;
&#xD;
void loop() {&#xD;
 val = analogRead(0);&#xD;
 voltage = val * 5.0;&#xD;
 voltage = voltage/1024.0;&#xD;
 temperature = (voltage-0.5)*100;&#xD;
 average += temperature;&#xD;
 count++;&#xD;
 if( count&amp;gt;59 ) {&#xD;
  p.begin(&amp;#034;/usr/bin/curl&amp;#034;);&#xD;
  p.addParameter(&amp;#034;--insecure&amp;#034;);&#xD;
  p.addParameter(&amp;#034;--location&amp;#034;);&#xD;
  p.addParameter(&amp;#034;https://datadrop.wolframcloud.com/api/v1.0/Add?bin=YOUR_BIN_ID&amp;amp;temperature=&amp;#034;+String(average/60));&#xD;
  p.run();&#xD;
  while(p.available()&amp;gt;0) {&#xD;
   char c = p.read();&#xD;
   Serial.print(c);&#xD;
  }&#xD;
  Serial.println();&#xD;
  Serial.flush();&#xD;
  count = 0;&#xD;
  average = 0;&#xD;
 }&#xD;
 delay(1000);&#xD;
}&#xD;
&amp;lt;/pre&amp;gt;&#xD;
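
The conversion arithmetic in the sketch is easy to sanity-check outside the Arduino. Here is a minimal, language-neutral sketch of the same formula in Python (the sample reading is made up for illustration):

```python
# TMP36 conversion as used in the sketch: 10-bit ADC reading -> volts -> degrees C
def tmp36_celsius(raw, vref=5.0, adc_max=1024.0):
    voltage = raw * vref / adc_max   # analogRead value scaled to volts
    return (voltage - 0.5) * 100     # 0.5 V offset, 10 mV per degree Celsius

# A raw reading of 256 corresponds to 1.25 V, i.e. 75 degrees C
print(tmp36_celsius(256))  # 75.0
```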
&#xD;
To test that everything worked, you can open the Arduino serial monitor. If successful, you will see messages like the one below appear every minute or so:&#xD;
&#xD;
    &amp;lt;|&amp;#034;Message&amp;#034; -&amp;gt; &amp;#034;The data was successfully added.&amp;#034;, &#xD;
    &amp;#034;Bin&amp;#034; -&amp;gt; &amp;#034;DD7051e03ace9-a194-44c1-9864-8fcef8ea9af3&amp;#034;, &#xD;
    &amp;#034;Data&amp;#034; -&amp;gt; &amp;lt;|&amp;#034;temperature&amp;#034; -&amp;gt; &amp;#034;34&amp;#034;|&amp;gt;, &#xD;
    &amp;#034;Timestamp&amp;#034; -&amp;gt; {2015, 2, 9, 16, 18, 39.99526`8.354583502904967}, &#xD;
    &amp;#034;Information&amp;#034; -&amp;gt; {&amp;#034;EntryCount&amp;#034; -&amp;gt; 1, &amp;#034;LatestTimestamp&amp;#034; -&amp;gt; 3632487520, &amp;#034;Size&amp;#034; -&amp;gt; 288}|&amp;gt;&#xD;
&#xD;
Now you can put your device in a weather resistant container (I used a Hefty bag) and place it outside in a location that is shaded for most of the day (like a porch):&#xD;
&#xD;
![enter image description here][8]&#xD;
&#xD;
Analysis of the temperature data&#xD;
----------------------&#xD;
Now we&amp;#039;re ready to do some interesting analysis of your temperature data! It&amp;#039;s best to collect at least one full day of data before analyzing it, but the code below should work for shorter time periods as well. First we need to get the data from the databin we used to upload the temperature data. The Arduino sketch uses &amp;#039;temperature&amp;#039; as the URL parameter for the temperature, so we need to use the same key here to retrieve it. In this example, my databin has collected data for about 20 days (for your experiment, replace the text &amp;#039;YOUR_BIN_ID&amp;#039; with the bin id shown in the output from CreateDatabin[] above):&#xD;
&#xD;
    bin = Databin[&amp;#034;YOUR_BIN_ID&amp;#034;]; &#xD;
    data = bin[&amp;#034;Data&amp;#034;];&#xD;
    temperature = data[&amp;#034;temperature&amp;#034;]&#xD;
 &#xD;
![enter image description here][9]&#xD;
&#xD;
Now we will need to transform the temperature event series in a few more steps: &#xD;
&#xD;
First, shift the series to account for the proper time zone (CST). This is done with the TimeSeriesShift command.&#xD;
&#xD;
Next, calibrate the temperatures so they more closely match official NOAA measurements from a nearby weather station. In my case I used an official NOAA weather station (KCMI) to calibrate my $2 TMP36 temperature sensor against its undoubtedly much more expensive and precise sensor data. Calibration is an important step; in this case I had to correct my data by about 5 degrees Celsius to match the official data. Another good way to calibrate your TMP36 sensor is to place it in a cup of ice water (0 degrees Celsius) and a cup of boiling water (100 degrees Celsius at sea level).&#xD;
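
A two-point calibration of this kind reduces to fitting a line through the two reference readings. A minimal Python sketch of the arithmetic (the example sensor readings here are hypothetical):

```python
def two_point_calibration(r0, r100):
    """Return a function mapping raw sensor readings to calibrated degrees C,
    given the raw readings at 0 C (ice water) and 100 C (boiling water)."""
    scale = 100.0 / (r100 - r0)
    return lambda r: (r - r0) * scale

# Hypothetical sensor that reads 2.0 in ice water and 98.0 in boiling water
cal = two_point_calibration(2.0, 98.0)
print(cal(50.0))  # close to 50.0
```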
&#xD;
Next, define the time window of interest. In my case the starting point of reliable data was on DateObject[{2015,1,22,21,0,0}]  (January 22 at 9PM). You will need to change this to a date which is a good starting point for your data.&#xD;
&#xD;
Finally, resample the data to evenly spaced periods of 15 minutes. You will quickly notice that recording data every minute will give you a massive amount of data points and sampling it back to 15 minute intervals will still give you enough data to work with for plots that show multiple days of data.&#xD;
&#xD;
Here is the code for all the steps discussed above:&#xD;
&#xD;
    temperature = EventSeries[Cases[First[temperature[&amp;#034;Paths&amp;#034;]], {_Real, _Real}]]; &#xD;
    temperature = TimeSeriesShift[temperature, Quantity[-6, &amp;#034;Hours&amp;#034;]];&#xD;
    temperature = TimeSeriesMap[# - 5 &amp;amp;, temperature];&#xD;
    temperature = TimeSeriesWindow[ temperature, {DateObject[{2015, 1, 22, 21, 0, 0}], Now}];&#xD;
    temperature = TimeSeriesResample[temperature, Quantity[0.25, &amp;#034;Hours&amp;#034;], ResamplingMethod -&amp;gt; {&amp;#034;Interpolation&amp;#034;, InterpolationOrder -&amp;gt; 1}];&#xD;
&#xD;
Now at this point you can do a quick check that your temperature data is looking OK:&#xD;
&#xD;
    DateListPlot[temperature, GridLines -&amp;gt; Automatic]&#xD;
&#xD;
![enter image description here][10]&#xD;
&#xD;
But we can make this a lot more useful and interesting. Let&amp;#039;s write a function which will collect a specific point of interest for each day of data, for example the Min and Max of this data:&#xD;
&#xD;
    TimeSeriesFilter[series_, func_] := Module[{startdate, enddate, daterange}, &#xD;
      startdate = First[series[&amp;#034;Dates&amp;#034;]] /. {DateObject[{y_, m_, d_}, TimeObject[{_, _, _}]] :&amp;gt; DateObject[{y, m, d}, TimeObject[{0, 0, 0}]]};&#xD;
      enddate = Last[series[ &amp;#034;Dates&amp;#034;]] /. {DateObject[{y_, m_, d_}, TimeObject[{_, _, _}]] :&amp;gt; DateObject[{y, m, d}, TimeObject[{0, 0, 0}]]};&#xD;
      daterange = DateRange[startdate, enddate, Quantity[1, &amp;#034;Days&amp;#034;]];&#xD;
      EventSeries[ &#xD;
       Table[window =  TimeSeriesWindow[ series, {date, DatePlus[date, Quantity[1, &amp;#034;Days&amp;#034;]]}];&#xD;
       path = window[&amp;#034;Path&amp;#034;];&#xD;
       First[Select[path, Last[#] == func[window] &amp;amp;]], {date, daterange}]&#xD;
     ]&#xD;
    ]&#xD;
&#xD;
The function above generates a new EventSeries from the given one, collecting for each day the data point that satisfies the given function (for example, the minimum or maximum of that day).&#xD;
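
The same per-day reduction is straightforward to express in any language; a minimal Python sketch of the idea, using (day, value) pairs instead of an EventSeries (the data is made up):

```python
# Group timestamped values by day and keep, per day, the value selected by func
def daily_extremes(points, func):
    """points: list of (day, value); returns one (day, value) per day,
    where the value is func (e.g. min or max) of that day's values."""
    days = {}
    for day, value in points:
        days.setdefault(day, []).append(value)
    return [(day, func(values)) for day, values in sorted(days.items())]

data = [(1, 3.0), (1, -2.0), (2, 5.5), (2, 4.0)]
print(daily_extremes(data, min))  # [(1, -2.0), (2, 4.0)]
print(daily_extremes(data, max))  # [(1, 3.0), (2, 5.5)]
```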
&#xD;
First let&amp;#039;s create a new EventSeries which contains all the daily minimum temperatures:&#xD;
&#xD;
    minseries = TimeSeriesFilter[temperature, Min]&#xD;
&#xD;
![enter image description here][11]&#xD;
&#xD;
And similarly, one which contains all the daily maximum temperatures:&#xD;
&#xD;
    maxseries = TimeSeriesFilter[temperature, Max]&#xD;
&#xD;
![enter image description here][12]&#xD;
&#xD;
And now we can plot the temperature data (purple) with the daily minimum temperatures (blue dots) and maximum temperatures (red dots):&#xD;
&#xD;
    DateListPlot[{temperature, minseries, maxseries}, &#xD;
     GridLines -&amp;gt; Automatic, Filling -&amp;gt; Axis, &#xD;
     PlotLabel -&amp;gt; Style[&amp;#034;Temperature (\[Degree]C) in Savoy, Illinois&amp;#034;, 16, FontColor -&amp;gt; White], ImageSize -&amp;gt; Large, &#xD;
     Joined -&amp;gt; {True, False, False}, Background -&amp;gt; Black, &#xD;
     FrameStyle -&amp;gt; White, ImagePadding -&amp;gt; 30, &#xD;
     PlotStyle -&amp;gt; {Purple, {Blue, AbsolutePointSize[6]}, {Red, AbsolutePointSize[6]}}]&#xD;
&#xD;
![enter image description here][13]&#xD;
&#xD;
Conclusion&#xD;
-------------&#xD;
And that is it for this experiment! You now have a working weather station and a dataset that is easy to analyze: by modifying the given code you can visualize daily, weekly or monthly averages. Or you can try to make predictions for tomorrow&amp;#039;s weather based on patterns you have observed in the past (and perhaps combine this data with additional weather measurements like pressure and humidity).&#xD;
&#xD;
&#xD;
  [1]: https://datadrop.wolframcloud.com/&#xD;
  [2]: /c/portal/getImageAttachment?filename=one.png&amp;amp;userId=22112&#xD;
  [3]: http://arduino.cc/en/Main/ArduinoBoardYun?from=Products.ArduinoYUN&#xD;
  [4]: https://www.sparkfun.com/products/10988&#xD;
  [5]: /c/portal/getImageAttachment?filename=two.png&amp;amp;userId=22112&#xD;
  [6]: /c/portal/getImageAttachment?filename=three.png&amp;amp;userId=22112&#xD;
  [7]: /c/portal/getImageAttachment?filename=four.png&amp;amp;userId=22112&#xD;
  [8]: /c/portal/getImageAttachment?filename=five.png&amp;amp;userId=22112&#xD;
  [9]: /c/portal/getImageAttachment?filename=six.png&amp;amp;userId=22112&#xD;
  [10]: /c/portal/getImageAttachment?filename=seven.png&amp;amp;userId=22112&#xD;
  [11]: /c/portal/getImageAttachment?filename=eight.png&amp;amp;userId=22112&#xD;
  [12]: /c/portal/getImageAttachment?filename=nine.png&amp;amp;userId=22112&#xD;
  [13]: /c/portal/getImageAttachment?filename=ten.png&amp;amp;userId=22112</description>
    <dc:creator>Arnoud Buzing</dc:creator>
    <dc:date>2015-03-06T22:05:52Z</dc:date>
  </item>
  <item rdf:about="https://community.wolfram.com/groups/-/m/t/992466">
    <title>Classifier for Human Motions with data from an accelerometer</title>
    <link>https://community.wolfram.com/groups/-/m/t/992466</link>
    <description>This project was part of a Wolfram Mentorship Program.&#xD;
&#xD;
The classification of human motions based on patterns and physical data is of great importance in developing areas such as robotics. A function that recognizes a specific human motion can also be an important addition to artificial intelligence and physiological monitoring systems. This project is about acquiring, curating and analyzing experimental data from actions such as walking, running and climbing stairs. The data, taken with the help of an accelerometer, needs to be turned into an acceptable input for the Classify function. Finally, the function can be updated with more data and classes to make it more accurate and complete.&#xD;
&#xD;
**Algorithms and procedures**&#xD;
&#xD;
The data for this project was acquired by programming an Arduino UNO microcontroller from a Raspberry Pi computer, using the Wolfram Language. An accelerometer connected to the Arduino sent measurements each time it was called upon, and Mathematica on the Raspberry Pi collected and uploaded the data. &#xD;
The raw data had to be processed to make it a good input for the Classify function. First, it was transformed into a spectrogram (to analyze the frequency domain of the data). Then the spectrogram&amp;#039;s image was put through the IFData function, which filters out some of the noise, and finally the images were converted into numerical data with the UpToMeasurements function (main function: ComponentMeasurements).&#xD;
This collection of numerical data was fed to the Classify function under six different classes (standing, walking, running, jumping, waving and climbing stairs).&#xD;
&#xD;
*The IFData function and the UpToMeasurements functions were sent to me by Todd Rowland during the Mentorship. Both functions will be shown at the end of this post.&#xD;
&#xD;
**Example visualization**&#xD;
&#xD;
The following ListLinePlot is an extract from the jumping data &#xD;
&#xD;
![Example data][1]&#xD;
&#xD;
Next, the data from the plot above is turned into a spectrogram by the function Spectrogram, i.e.:   &#xD;
&#xD;
    spectrogramImage = &#xD;
     Spectrogram[jumpingData, SampleRate -&amp;gt; 10, FrameTicks -&amp;gt; None, &#xD;
      Frame -&amp;gt; False, Ticks -&amp;gt; None, FrameLabel -&amp;gt; None]&#xD;
&#xD;
&#xD;
&#xD;
![Example jumping data spectrogram][2]&#xD;
&#xD;
Finally, all the spectrogram images are used as input for the UpToMeasurements function, along with some properties for the ComponentMeasurements function:&#xD;
&#xD;
 i.e:  &#xD;
&#xD;
    numericalData = &#xD;
     N@Flatten[&#xD;
       UpToMeasurements[&#xD;
        spectrogramImage, {&amp;#034;EnclosingComponentCount&amp;#034;, &amp;#034;Max&amp;#034;, &#xD;
         &amp;#034;MaxIntensity&amp;#034;, &amp;#034;TotalIntensity&amp;#034;, &amp;#034;StandardDeviationIntensity&amp;#034;, &#xD;
         &amp;#034;ConvexCoverage&amp;#034;, &amp;#034;Total&amp;#034;, &amp;#034;Skew&amp;#034;, &amp;#034;FilledCircularity&amp;#034;, &#xD;
         &amp;#034;MaxCentroidDistance&amp;#034;, &amp;#034;ExteriorNeighborCount&amp;#034;, &amp;#034;Area&amp;#034;, &#xD;
         &amp;#034;MinCentroidDistance&amp;#034;, &amp;#034;FilledCount&amp;#034;, &amp;#034;MeanIntensity&amp;#034;, &#xD;
         &amp;#034;StandardDeviation&amp;#034;, &amp;#034;Energy&amp;#034;, &amp;#034;Count&amp;#034;, &amp;#034;MeanCentroidDistance&amp;#034;}, &#xD;
        1]]&#xD;
&#xD;
Which outputs a list of real numbers, one for each of the properties:&#xD;
&#xD;
    {0., 1., 1., 1., 1., 19294.9, 0.222164, 0.985741, 31011.8, 15212.5, \&#xD;
    9624.42, -0.0596506, 0.724527, 190.534, 0., 42584.5, 0.364667, \&#xD;
    42584., 0.453101, 0.315209, 0.232859, 0.169549, 0.00909654, 42584., \&#xD;
    98.7136}&#xD;
&#xD;
These numbers are grouped in a nested list which contains data for all six human motions. All the data is then used to build a classifier with the Classify function.&#xD;
&#xD;
After several combinations of both properties and data sets, I was able to produce classifier functions with an accuracy of 91% and a total size of 269 KB. &#xD;
&#xD;
------------------------------------------------------------&#xD;
&#xD;
**Attempt on building a classify function using image processing**&#xD;
&#xD;
On the other hand, the image processing capabilities of Mathematica let us extract data from images, so it should be possible to create a classifier that recognizes the movement patterns in the frames of a video. First, I had to remove the noise from every image; this proved troublesome, since the background can vary greatly between video samples. Then I binarized the images in order to isolate the moving regions in each frame and extracted their positions with ImageData. Lastly, a data set can be formed from all the analyzed frames; this data can essentially be used in the same way as the accelerometer&amp;#039;s, but the classifier was unsuccessful in separating the samples accurately. &#xD;
This was mainly because the accelerometer&amp;#039;s data is taken at a constant rate and very precisely, whereas the images depend on the camera&amp;#039;s frame rate and many other external factors, making the data different enough to resist accurate classification. However, if a big dataset is made from videos of people performing certain actions, the data processing can follow steps similar to the ones explained in this report, producing a similar classifier function. This could further increase the function&amp;#039;s accuracy, but the process needs an algorithm that can effectively trace the path of &amp;#034;a particle&amp;#034; moving through the frames of the video and extract precise velocity data from that movement.&#xD;
&#xD;
------------------------------------------------------------&#xD;
&#xD;
In conclusion, the Classify function works very well with the data provided; its accuracy is about 91% with the SupportVectorMachine method. This is a very good result for the human motion classifier. The next step is to add more classes to the function and to test the classifier with data acquired from different sources, such as another accelerometer and various videos of human motion footage.&#xD;
&#xD;
-----------------------------------------&#xD;
&#xD;
**Code:**&#xD;
&#xD;
 - UpToMeasurements function&#xD;
&#xD;
        UpToMeasurements[image_,property_,n_]:=MaximalBy[ComponentMeasurements[image,&amp;#034;Count&amp;#034;],Last,UpTo[n]][[All,1]]/.ComponentMeasurements[image,property]&#xD;
&#xD;
*Note: This function simplifies the exploration of properties to input into ComponentMeasurements; it also outputs a usable list of numerical data retrieved from a given group of images.&#xD;
&#xD;
 - IFData function:&#xD;
&#xD;
        imagefunctions=&amp;lt;|1-&amp;gt; (EntropyFilter[#,3]&amp;amp;),&#xD;
        2-&amp;gt; (EdgeDetect[EntropyFilter[#,3]]&amp;amp;),&#xD;
        3-&amp;gt;Identity,&#xD;
        4-&amp;gt; (ImageAlign[reference110,#]&amp;amp;),&#xD;
        5-&amp;gt; (ImageHistogram[#,FrameTicks-&amp;gt;None,Frame-&amp;gt;False,FrameLabel-&amp;gt;None,Ticks-&amp;gt;None]&amp;amp;),&#xD;
        6-&amp;gt; (ImageApply[#^.6&amp;amp;,#]&amp;amp;),&#xD;
        7-&amp;gt; (Colorize[MorphologicalComponents[#]]&amp;amp;),&#xD;
        8-&amp;gt; (HighlightImage[#,ImageCorners[#,1,.001,5]]&amp;amp;),&#xD;
        9-&amp;gt; (HighlightImage[#,Graphics[Disk[{200,200},200]]]&amp;amp;),&#xD;
        10-&amp;gt; ImageRotate,&#xD;
        11-&amp;gt; (ImageRotate[#,45Degree]&amp;amp;),&#xD;
        12-&amp;gt;(ImageTransformation[#,Sqrt]&amp;amp;),&#xD;
        13-&amp;gt;(ImageTransformation[#,Function[p,With[{C=150.,R=35.},{p[[1]]+(R*Cos[(p[[1]]-C)*360*2/R]/6),p[[2]]}]]]&amp;amp;),&#xD;
        14-&amp;gt;( Dilation[#,DiskMatrix[4]]&amp;amp;),&#xD;
        15-&amp;gt;( ImageSubtract[Dilation[#,1],#]&amp;amp;),&#xD;
        16-&amp;gt; (Erosion[#,DiskMatrix[4]]&amp;amp;),&#xD;
        17-&amp;gt; (Opening[#,DiskMatrix[4]]&amp;amp;),&#xD;
        18-&amp;gt;(Closing[#,DiskMatrix[4]]&amp;amp;),&#xD;
        19-&amp;gt;DistanceTransform,&#xD;
        20-&amp;gt; InverseDistanceTransform,&#xD;
        21-&amp;gt; (HitMissTransform[#,{{1,-1},{-1,-1}}]&amp;amp;),&#xD;
        22-&amp;gt;(TopHatTransform[#,5]&amp;amp;),&#xD;
        23-&amp;gt;(BottomHatTransform[#,5]&amp;amp;), &#xD;
        24-&amp;gt; (MorphologicalTransform[Binarize[#],Max]&amp;amp;),&#xD;
        25-&amp;gt; (MorphologicalTransform[Binarize[#],&amp;#034;EndPoints&amp;#034;]&amp;amp;),&#xD;
        26-&amp;gt;MorphologicalGraph,&#xD;
        27-&amp;gt;SkeletonTransform,&#xD;
        28-&amp;gt;Thinning,&#xD;
        29-&amp;gt;Pruning,&#xD;
        30-&amp;gt; MorphologicalBinarize,&#xD;
        31-&amp;gt; (ImageAdjust[DerivativeFilter[#,{1,1}]]&amp;amp;),&#xD;
        32-&amp;gt; (GradientFilter[#,1]&amp;amp;),&#xD;
        33-&amp;gt; MorphologicalPerimeter,&#xD;
        34-&amp;gt; Radon&#xD;
        |&amp;gt;;&#xD;
        &#xD;
        reference110=BlockRandom[SeedRandom[&amp;#034;110&amp;#034;];Image[CellularAutomaton[110,RandomInteger[1,400],400]]];&#xD;
        &#xD;
        IFData[n_Integer]:=Lookup[imagefunctions,n,Identity]&#xD;
        &#xD;
        IFData[&amp;#034;Count&amp;#034;]:=Length[imagefunctions]&#xD;
        &#xD;
        IFData[All]:=imagefunctions&#xD;
&#xD;
*Note: This function groups together several image filtering functions; it was used to simplify the exploration of functions to be used in the classifier. &#xD;
**This function was written by the Wolfram team, but was slightly modified for this project.&#xD;
&#xD;
 - propertyVector function (this function automatically evaluates all the prior necessary code needed to create the classify functions):&#xD;
&#xD;
        propertyVector[property_]:={walkingvector=N@Flatten[UpToMeasurements[#,property,1]]&amp;amp;/@IFData[6]/@(Spectrogram[#,SampleRate-&amp;gt;10,FrameTicks-&amp;gt;None,Frame-&amp;gt;False,Ticks-&amp;gt;None,FrameLabel-&amp;gt;None]&amp;amp;/@walk);&#xD;
        jumpingvector=N@Flatten[UpToMeasurements[#,property,1]]&amp;amp;/@IFData[6]/@(Spectrogram[#,SampleRate-&amp;gt;10,FrameTicks-&amp;gt;None,Frame-&amp;gt;False,Ticks-&amp;gt;None,FrameLabel-&amp;gt;None]&amp;amp;/@jump);&#xD;
        standingvector=N@Flatten[UpToMeasurements[#,property,1]]&amp;amp;/@IFData[6]/@(Spectrogram[#,SampleRate-&amp;gt;10,FrameTicks-&amp;gt;None,Frame-&amp;gt;False,Ticks-&amp;gt;None,FrameLabel-&amp;gt;None]&amp;amp;/@stand);&#xD;
        runningvector=N@Flatten[UpToMeasurements[#,property,1]]&amp;amp;/@IFData[6]/@(Spectrogram[#,SampleRate-&amp;gt;10,FrameTicks-&amp;gt;None,Frame-&amp;gt;False,Ticks-&amp;gt;None,FrameLabel-&amp;gt;None]&amp;amp;/@run);&#xD;
        wavingvector=N@Flatten[UpToMeasurements[#,property,1]]&amp;amp;/@IFData[6]/@(Spectrogram[#,SampleRate-&amp;gt;10,FrameTicks-&amp;gt;None,Frame-&amp;gt;False,Ticks-&amp;gt;None,FrameLabel-&amp;gt;None]&amp;amp;/@wave);&#xD;
        stairsvector=N@Flatten[UpToMeasurements[#,property,1]]&amp;amp;/@IFData[6]/@(Spectrogram[#,SampleRate-&amp;gt;10,FrameTicks-&amp;gt;None,Frame-&amp;gt;False,Ticks-&amp;gt;None,FrameLabel-&amp;gt;None]&amp;amp;/@stairs);&#xD;
        walkingvectortest=N@Flatten[UpToMeasurements[#,property,1]]&amp;amp;/@IFData[6]/@(Spectrogram[#,SampleRate-&amp;gt;10,FrameTicks-&amp;gt;None,Frame-&amp;gt;False,Ticks-&amp;gt;None,FrameLabel-&amp;gt;None]&amp;amp;/@testwalk);&#xD;
        jumpingvectortest=N@Flatten[UpToMeasurements[#,property,1]]&amp;amp;/@IFData[6]/@(Spectrogram[#,SampleRate-&amp;gt;10,FrameTicks-&amp;gt;None,Frame-&amp;gt;False,Ticks-&amp;gt;None,FrameLabel-&amp;gt;None]&amp;amp;/@testjump);&#xD;
        standingvectortest=N@Flatten[UpToMeasurements[#,property,1]]&amp;amp;/@IFData[6]/@(Spectrogram[#,SampleRate-&amp;gt;10,FrameTicks-&amp;gt;None,Frame-&amp;gt;False,Ticks-&amp;gt;None,FrameLabel-&amp;gt;None]&amp;amp;/@teststand);&#xD;
        runningvectortest=N@Flatten[UpToMeasurements[#,property,1]]&amp;amp;/@IFData[6]/@(Spectrogram[#,SampleRate-&amp;gt;10,FrameTicks-&amp;gt;None,Frame-&amp;gt;False,Ticks-&amp;gt;None,FrameLabel-&amp;gt;None]&amp;amp;/@testrun);&#xD;
        wavingvectortest=N@Flatten[UpToMeasurements[#,property,1]]&amp;amp;/@IFData[6]/@(Spectrogram[#,SampleRate-&amp;gt;10,FrameTicks-&amp;gt;None,Frame-&amp;gt;False,Ticks-&amp;gt;None,FrameLabel-&amp;gt;None]&amp;amp;/@testwave);&#xD;
        stairsvectortest=N@Flatten[UpToMeasurements[#,property,1]]&amp;amp;/@IFData[6]/@(Spectrogram[#,SampleRate-&amp;gt;10,FrameTicks-&amp;gt;None,Frame-&amp;gt;False,Ticks-&amp;gt;None,FrameLabel-&amp;gt;None]&amp;amp;/@teststairs);}&#xD;
        &#xD;
        Training:=trainingSet=&amp;lt;|&amp;#034;walking&amp;#034;-&amp;gt;walkingvector,&amp;#034;running&amp;#034;-&amp;gt;runningvector,&#xD;
        &amp;#034;standing&amp;#034;-&amp;gt; standingvector,&#xD;
        &amp;#034;jumping&amp;#034;-&amp;gt; jumpingvector,&#xD;
        &amp;#034;waving&amp;#034;-&amp;gt; wavingvector,&#xD;
        &amp;#034;stairs&amp;#034;-&amp;gt; stairsvector|&amp;gt;;&#xD;
        &#xD;
        Test:=testSet=&amp;lt;|&amp;#034;walking&amp;#034;-&amp;gt;walkingvectortest,&amp;#034;running&amp;#034;-&amp;gt;runningvectortest,&#xD;
        &amp;#034;standing&amp;#034;-&amp;gt; standingvectortest,&#xD;
        &amp;#034;jumping&amp;#034;-&amp;gt; jumpingvectortest,&#xD;
        &amp;#034;waving&amp;#034;-&amp;gt; wavingvectortest,&#xD;
        &amp;#034;stairs&amp;#034;-&amp;gt; stairsvectortest|&amp;gt;;&#xD;
&#xD;
 - Example code for the acceleration data acquisition from image processing:&#xD;
&#xD;
        images=Import[&amp;#034;$path&amp;#034;]&#xD;
        motionData=&#xD;
        Count[#,1]&amp;amp;/@ &#xD;
          (Flatten[    	&#xD;
          	ImageData[Binarize[ImageSubtract[ImageSubtract[#[[1]],#[[2]]],ImageSubtract[#[[2]],#[[3]]]]]]&amp;amp;/@&#xD;
        		  Partition[images,3,1],1])&#xD;
&#xD;
*Note: before this code can be used, the backgrounds of the frames of the video have to be removed, and the image has to be binarized as much as possible (some examples will be shown in the next section).&#xD;
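
The idea behind the frame subtraction above — differencing consecutive frames and counting the pixels that changed — can be sketched in a few lines of Python. This is a simplified pairwise version of the triple-frame double subtraction in the code above, on made-up binary frames:

```python
# Count, for each consecutive frame pair, how many pixels changed.
def motion_counts(frames):
    counts = []
    for a, b in zip(frames, frames[1:]):
        counts.append(sum(pa != pb for pa, pb in zip(a, b)))
    return counts

# Three tiny 4-pixel binary "frames": a bright pixel moving left to right
frames = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0]]
print(motion_counts(frames))  # [2, 2]
```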
&#xD;
 - Example code for the retrieval of raw data from DataDrop:&#xD;
&#xD;
        rawData=Values[Databin[&amp;#034;Serial#&amp;#034;, {#}]];&#xD;
        data=Flatten[rawData[&amp;#034;(xacc/yacc/zacc)&amp;#034;]];&#xD;
&#xD;
---------------------------------&#xD;
&#xD;
Please feel free to contact me or comment if you are interested in the rest of the code (uploading the C code to the Arduino, the manufacturer&amp;#039;s code for the accelerometer, the C code switch that lets Mathematica communicate with the Arduino, and the Wolfram Language code used to start each loop in the switch that retrieves data). I can also send the classify function, or any other information that I might have left out; all suggestions welcome.&#xD;
&#xD;
  [1]: http://community.wolfram.com//c/portal/getImageAttachment?filename=1.png&amp;amp;userId=602285&#xD;
  [2]: http://community.wolfram.com//c/portal/getImageAttachment?filename=2.png&amp;amp;userId=602285</description>
    <dc:creator>Pablo Ruales</dc:creator>
    <dc:date>2017-01-11T01:15:04Z</dc:date>
  </item>
  <item rdf:about="https://community.wolfram.com/groups/-/m/t/822762">
    <title>IoT: Controlling an RGB LED with the Wolfram Cloud, GPIO pins</title>
    <link>https://community.wolfram.com/groups/-/m/t/822762</link>
    <description>Using the Wolfram Cloud in conjunction with an embedded Linux device allows one to create neat IoT applications. Here, by using a Beaglebone Black (BBB) and its plethora of IO ports, it was possible to control an RGB LED connected using the cloud.&#xD;
&#xD;
Although I&amp;#039;m using a Beaglebone Black for this project to show that it is possible to use Wolfram Language on other ARM devices, this can most likely be done in a similar manner on a Raspberry Pi.&#xD;
&#xD;
#Parts Used#&#xD;
&#xD;
 - 1 x Common Cathode RGB LED&#xD;
 - 3 x 220 Ω Resistors&#xD;
 - 4 x Jumper Wires&#xD;
 - 1 x Beaglebone Black (of course!)&#xD;
&#xD;
The BBB is running Debian Jessie off of a MicroSD card. Since running the Mathematica front end is a little resource heavy on the BBB, I would not suggest developing the functions in a notebook; instead, create an .m file that contains all the functions and just `Get` the .m file from the notebook. For faster development, I recommend using `ssh` to connect to the BBB and editing the file in a terminal editor.&#xD;
&#xD;
![Fritzing Diagram][1]&#xD;
&#xD;
In order for these instructions to work, you will need to run `CloudConnect[]` in a notebook and log in to your cloud account on your BBB.&#xD;
&#xD;
In order to be able to use our GPIO pins on the Beaglebone Black, we have to set them up. Here, I have connected the red pin to P9\_12, green to P9\_15, blue to P9\_23, and of course, DGND (reminder: this is common cathode!). If you&amp;#039;re wondering what all that means, refer to [this][2] where P9\_12 is referring to pin 12 on the P9 header. &#xD;
&#xD;
![enter image description here][3]&#xD;
&#xD;
Looking at the image, P9\_12 corresponds to GPIO\_60, P9\_15 to GPIO\_48 and P9\_23 to GPIO\_49. So, if you&amp;#039;re using the same connections for your RGB LED as I am, you can go ahead and run the following code in the command line on the BBB as root:&#xD;
&#xD;
    echo 60 &amp;gt; /sys/class/gpio/export&#xD;
    echo 49 &amp;gt; /sys/class/gpio/export&#xD;
    echo 48 &amp;gt; /sys/class/gpio/export&#xD;
&#xD;
For each pin, there is a file within the `/sys/class/gpio/gpio*` folder that sets the direction and another that sets the value. The following function constructs the path to those files for a given pin and type (value or direction).&#xD;
&#xD;
    file[pin_Integer, type_String]:= &amp;#034;/sys/class/gpio/gpio&amp;#034; &amp;lt;&amp;gt; ToString[pin] &amp;lt;&amp;gt; &amp;#034;/&amp;#034; &amp;lt;&amp;gt; type&#xD;
&#xD;
    outputDigitalString[pin_Integer, type_String] := &amp;#034;echo &amp;#034; &amp;lt;&amp;gt; type &amp;lt;&amp;gt; &amp;#034; &amp;gt; &amp;#034; &amp;lt;&amp;gt; file[pin, &amp;#034;direction&amp;#034;]&#xD;
    outputDigitalString[pin_Integer, bool_?BooleanQ] := &amp;#034;echo &amp;#034; &amp;lt;&amp;gt; ToString[Boole[bool]] &amp;lt;&amp;gt; &amp;#034; &amp;gt; &amp;#034; &amp;lt;&amp;gt; file[pin, &amp;#034;value&amp;#034;]&#xD;
&#xD;
The `outputDigitalString` function constructs a string command that specifies the direction or the value of a pin, depending on whether the second argument is a string or a boolean respectively. Since a digital pin&amp;#039;s direction has to be set before outputting the value, the following function uses both constructed strings to set the direction and then the value.&#xD;
&#xD;
    outputDigital[pin_Integer, bool_?BooleanQ] := (Run[outputDigitalString[pin, &amp;#034;out&amp;#034;]]; Run[outputDigitalString[pin, bool]];)&#xD;
&#xD;
Finally, the following function is just a helper function to deal with the case when we provide the function with pin1 instead of 1 to address the pins.&#xD;
&#xD;
    outputDigital[s_String, rest___] := outputDigital[ToExpression@StringReplace[s, &amp;#034;pin&amp;#034; -&amp;gt; &amp;#034;&amp;#034;], rest]&#xD;
&#xD;
Next, we begin the functions that will allow us to communicate through the Wolfram Cloud.&#xD;
&#xD;
    f[rules_] := CloudPut[rules, &amp;#034;ColorForm&amp;#034;, Permissions -&amp;gt; &amp;#034;Public&amp;#034;]&#xD;
    form = FormFunction[{{&amp;#034;pin60&amp;#034;, &amp;#034;Red&amp;#034;} -&amp;gt; &amp;#034;Boolean&amp;#034;, {&amp;#034;pin48&amp;#034;, &amp;#034;Green&amp;#034;} -&amp;gt; &amp;#034;Boolean&amp;#034;, {&amp;#034;pin49&amp;#034;, &amp;#034;Blue&amp;#034;} -&amp;gt; &amp;#034;Boolean&amp;#034;}, f, HTML];&#xD;
&#xD;
This makes a `FormFunction` with one checkbox for each of the three colors. After submitting, the output will be `CloudPut` to the URI &amp;#034;ColorForm&amp;#034;. The Permissions setting can be omitted if you do not plan to share the link.&#xD;
 &#xD;
Now, we will head over to our notebook (**Reminder: you must have Mathematica opened as root in order for this to work!**):&#xD;
&#xD;
    Get[&amp;#034;/home/debian/CloudFunctions.m&amp;#034;]&#xD;
&#xD;
Where you can change the `/home/debian/CloudFunction.m` to point to the .m file that you created. Next: &#xD;
&#xD;
    CloudDeploy[form, Permissions -&amp;gt; &amp;#034;Public&amp;#034;]&#xD;
&#xD;
Again, it is not necessary to change the Permissions here if you&amp;#039;re not planning on sharing the link. Run the code and open the link that is given to you. From there, you can choose one or more colors and submit. You will be redirected and given a link which should have &amp;#034;ColorForm&amp;#034; at the end of the URL. Copy this URL and paste it into your .m file as `$formURL`. Then go back to your notebook and run the following:&#xD;
&#xD;
    oldInput = {};&#xD;
    While[True,&#xD;
     userInput = Normal[CloudGet[CloudObject[$formURL]]];&#xD;
     If[oldInput =!= userInput,&#xD;
      Replace[userInput,&#xD;
       HoldPattern[tag_ -&amp;gt; value_] :&amp;gt; outputDigital[tag, value], 1]];&#xD;
     oldInput = userInput&#xD;
     ]&#xD;
&#xD;
The `oldInput` variable makes sure the pins are rewritten only when the form submission has actually changed; otherwise the LED would flicker as the BBB keeps re-applying the same color.&#xD;
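
The change-detection pattern of that loop — poll, compare with the last seen value, act only on a change — is language neutral; here is a small Python sketch of it (the poll and act functions are hypothetical stand-ins):

```python
def apply_if_changed(poll, act, iterations):
    """Poll a value repeatedly; call act(value) only when it differs
    from the previously seen value (mirrors the oldInput comparison)."""
    old = None
    applied = []
    for _ in range(iterations):
        new = poll()
        if new != old:
            act(new)
            applied.append(new)
        old = new
    return applied

# Simulated form submissions: the color only changes once
readings = iter(["red", "red", "blue", "blue"])
print(apply_if_changed(lambda: next(readings), lambda v: None, 4))  # ['red', 'blue']
```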
&#xD;
![Form][4]&#xD;
&#xD;
There you have it! Now you can run the `While` loop and submit your colors through the form and watch them change *almost* immediately.&#xD;
&#xD;
If you would like to do a similar project with *more* color flexibility, take a look at [this][5].&#xD;
&#xD;
This is my first post here and I would appreciate any feedback or suggestions!&#xD;
&#xD;
&#xD;
  [1]: http://community.wolfram.com//c/portal/getImageAttachment?filename=8934RGBLEDDiagram.png&amp;amp;userId=766924&#xD;
  [2]: http://beagleboard.org/static/images/cape-headers.png&#xD;
  [3]: http://community.wolfram.com//c/portal/getImageAttachment?filename=cape-headers.png&amp;amp;userId=11733&#xD;
  [4]: http://community.wolfram.com//c/portal/getImageAttachment?filename=Form.png&amp;amp;userId=766924&#xD;
  [5]: http://community.wolfram.com/groups/-/m/t/824540</description>
    <dc:creator>Armeen Mahdian</dc:creator>
    <dc:date>2016-03-13T17:44:07Z</dc:date>
  </item>
  <item rdf:about="https://community.wolfram.com/groups/-/m/t/344241">
    <title>Reading high resolution weather data from Netatmo</title>
    <link>https://community.wolfram.com/groups/-/m/t/344241</link>
    <description>Together with [Björn Schelter][1] I have tried to read in data from the personal weather station [Netatmo][2] &#xD;
&#xD;
![enter image description here][3]&#xD;
&#xD;
which, as it turns out, is a very good companion for Mathematica; it also features in the [connected devices list of Wolfram][4]. This device measures temperature, humidity, pressure, noise level and potentially precipitation, indoors and outdoors. Users are encouraged to share the outdoors data; as the weather station is rather popular, there are lots of measurements. On their website [https://www.netatmo.com/][5] the company makes these measurements available. You can represent worldwide data &#xD;
&#xD;
![enter image description here][6]&#xD;
&#xD;
(I know that figure does not show the entire world!) or zoom in to street-level data:&#xD;
&#xD;
![enter image description here][7]&#xD;
&#xD;
On the website [https://dev.netatmo.com][8] you can sign up for a developer account, which gives you access to the Netatmo API. In this post I am going to show how to access the data with Mathematica. When you sign up for a Netatmo developer account you will be issued a client id and a client secret; these are rather long strings. You will also have a username and a password for your account. Next you need to request an access token, which you can do via the following command:&#xD;
&#xD;
&amp;gt; curl -X POST -d &#xD;
&amp;gt; &amp;#034;grant_type=password&amp;amp;client_id=AAAAAAAAAAAAAAAAAAAA&amp;amp;client_secret=BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB&#xD;
&amp;gt; &amp;amp;username=XXXXXXXXXXXXXXXX&amp;amp;password=YYYYYYYYYYYYYYY&amp;amp;scope=read_station&amp;#034;&#xD;
&amp;gt; http://api.netatmo.net/oauth2/token&amp;gt; ~/Desktop/request-token.txt&#xD;
&#xD;
On a Mac I save that string in a file called netatmo.sh on the desktop; I obviously substitute AAAAAAAAAAAA with the client id, BBBBBBBBBBBBBBBB with the client secret, and XXXXXXXXXXXX and YYYYYYYYYY with the username and the password. Then I make the script executable with the terminal command &#xD;
&#xD;
&amp;gt; chmod a+x netatmo.sh&#xD;
&#xD;
The rest is child&amp;#039;s play. We need to execute the command &#xD;
&#xD;
    Run[&amp;#034;~/Desktop/netatmo.sh&amp;#034;];&#xD;
    data = Import[&amp;#034;https://api.netatmo.net/api/getpublicdata?access_token=&amp;#034;&amp;lt;&amp;gt;Last[StringSplit[Import[&amp;#034;~/Desktop/request-token.txt&amp;#034;, &amp;#034;CSV&amp;#034;][[1, 1]], &amp;#034;\&amp;#034;&amp;#034;]] &amp;lt;&amp;gt; &#xD;
        &amp;#034;&amp;amp;lat_ne=59.91&amp;amp;lon_ne=13.75&amp;amp;lat_sw=40.42&amp;amp;lon_sw=-20.0&amp;amp;filter=True&amp;#034;, &amp;#034;Text&amp;#034;];&#xD;
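&#xD;
To make the region easy to change, the URL construction can be wrapped in a small helper (a sketch; `publicDataURL` is my own name, not part of the API, and the token is the string obtained above):&#xD;
&#xD;
    publicDataURL[token_, latSW_, lonSW_, latNE_, lonNE_] :=&#xD;
     &amp;#034;https://api.netatmo.net/api/getpublicdata?access_token=&amp;#034; &amp;lt;&amp;gt; token &amp;lt;&amp;gt;&#xD;
      &amp;#034;&amp;amp;lat_ne=&amp;#034; &amp;lt;&amp;gt; ToString[latNE] &amp;lt;&amp;gt; &amp;#034;&amp;amp;lon_ne=&amp;#034; &amp;lt;&amp;gt; ToString[lonNE] &amp;lt;&amp;gt;&#xD;
      &amp;#034;&amp;amp;lat_sw=&amp;#034; &amp;lt;&amp;gt; ToString[latSW] &amp;lt;&amp;gt; &amp;#034;&amp;amp;lon_sw=&amp;#034; &amp;lt;&amp;gt; ToString[lonSW] &amp;lt;&amp;gt; &amp;#034;&amp;amp;filter=True&amp;#034;&#xD;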
&#xD;
Note that the numbers following lat_ne, lon_ne, lat_sw, lon_sw are the north east and south west latitudes and longitudes. If we want to request data for other regions we can do so by changing these entries. Next we clean the data a little bit:&#xD;
&#xD;
    tab = Quiet[&#xD;
    Select[Select[Table[ToExpression /@ Flatten[StringSplit[#, &amp;#034;]&amp;#034;] &amp;amp; /@ StringSplit[#, &amp;#034;[&amp;#034;] &amp;amp; /@ &#xD;
    If[Length[StringSplit[StringSplit[data, &amp;#034;place&amp;#034;][[k]], &amp;#034;,&amp;#034;]] &amp;gt; 12, Drop[StringSplit[StringSplit[data, &amp;#034;place&amp;#034;][[k]],&amp;#034;,&amp;#034;], {5}], &#xD;
    StringSplit[StringSplit[data, &amp;#034;place&amp;#034;][[k]], &amp;#034;,&amp;#034;]]][[{2, 3, 7, 8, 15}]], {k, 2, Length[StringSplit[data, &amp;#034;place&amp;#034;]]}], Length[Cases[Flatten[#], $Failed]] == 0 &amp;amp;  ], Length[#] == 5 &amp;amp;]];&#xD;
&#xD;
That does look a bit cryptic but gives us what we want. &#xD;
&#xD;
    tab[[1]]&#xD;
&#xD;
gives {10.5673, 59.8929, 13, 80, 1032.9}, which are the GPS coordinates (longitude and latitude), the temperature in Celsius, the humidity in % and the pressure in mbar. I will now propose three different representations of the data.&#xD;
&#xD;
    scaled = Rescale[tab[[All, 3]]]; &#xD;
    GeoGraphics[Table[{GeoStyling[Opacity[0.99], RGBColor[scaled[[k]], 1 - scaled[[k]], 0]], GeoDisk[{tab[[k, 2]], tab[[k, 1]]}, Quantity[20, &amp;#034;Kilometers&amp;#034;] ]}, {k,1, Length[tab]}]]&#xD;
&#xD;
which gives:&#xD;
&#xD;
![enter image description here][9]&#xD;
&#xD;
The second representation is calculated using &#xD;
&#xD;
    GeoRegionValuePlot[GeoPosition[{#[[2]], #[[1]]}] -&amp;gt; #[[3]] &amp;amp; /@ tab, PlotRange -&amp;gt; {0, 30}, ColorFunction -&amp;gt; &amp;#034;TemperatureMap&amp;#034;, ImageSize -&amp;gt; Full]&#xD;
&#xD;
which looks like this&#xD;
&#xD;
![enter image description here][10]&#xD;
&#xD;
Finally, the lengthy sequence of commands&#xD;
&#xD;
    surface = Interpolation[{{#[[1]], #[[2]]}, #[[3]]} &amp;amp; /@ tab, InterpolationOrder -&amp;gt; 1];&#xD;
    cPlot = Quiet[ContourPlot[surface[x, y], {x, Min[tab[[All, 1]]], Max[tab[[All, 1]]]}, {y, Min[tab[[All, 2]]], Max[tab[[All, 2]]]}, ImagePadding -&amp;gt; None, &#xD;
    ClippingStyle -&amp;gt; None, Frame -&amp;gt; None, Contours -&amp;gt; 60, ContourLines -&amp;gt; False, PlotRange -&amp;gt; {0, 30}, ColorFunction -&amp;gt; &amp;#034;TemperatureMap&amp;#034;]];&#xD;
    multipoly = Polygon[GeoPosition[Join @@ (EntityValue[EntityClass[&amp;#034;Country&amp;#034;, &amp;#034;Europe&amp;#034;], &amp;#034;Polygon&amp;#034;] /. Polygon[GeoPosition[x_]] :&amp;gt; x)]];&#xD;
    GeoGraphics[{GeoStyling[{&amp;#034;GeoImage&amp;#034;, cPlot}], multipoly, Black, Opacity[1]}, ImageSize -&amp;gt; Full]&#xD;
&#xD;
gives this representation&#xD;
&#xD;
![enter image description here][11]&#xD;
&#xD;
I am quite sure that with some modifications one can make a useful program out of this using CloudDeploy. Also, netatmo&amp;#039;s data are updated every 30 minutes (every 5 minutes on the individual devices), so one can run a scheduled task and watch the development of the temperature. The large number of netatmo weather stations complements the data available from the Wolfram Data servers very nicely, as they provide very up-to-date, street-level data.&#xD;
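&#xD;
As a sketch of such a scheduled task (`fetchAndPlot` is just a stand-in name for the import-and-plot code above; 1800 seconds matches the 30-minute update interval):&#xD;
&#xD;
    task = RunScheduledTask[fetchAndPlot[], 1800];&#xD;
    (* RemoveScheduledTask[task] stops it again *)&#xD;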
&#xD;
I would be glad to see a good idea of a cloud deployed service based on this or any other ideas that you might have.&#xD;
&#xD;
Cheers,&#xD;
&#xD;
Marco&#xD;
&#xD;
&#xD;
  [1]: http://community.wolfram.com/web/bschelter&#xD;
  [2]: https://www.netatmo.com/en-US/product/weather-station&#xD;
  [3]: /c/portal/getImageAttachment?filename=ScreenShot2014-09-15at22.55.05.png&amp;amp;userId=48754&#xD;
  [4]: http://devices.wolfram.com/devices/netatmo-weather-station.html&#xD;
  [5]: https://www.netatmo.com/weathermap&#xD;
  [6]: /c/portal/getImageAttachment?filename=ScreenShot2014-09-15at22.59.02.png&amp;amp;userId=48754&#xD;
  [7]: /c/portal/getImageAttachment?filename=ScreenShot2014-09-15at22.59.37.png&amp;amp;userId=48754&#xD;
  [8]: https://dev.netatmo.com&#xD;
  [9]: /c/portal/getImageAttachment?filename=netatmofig1.gif&amp;amp;userId=48754&#xD;
  [10]: /c/portal/getImageAttachment?filename=Netatmofig2.gif&amp;amp;userId=48754&#xD;
  [11]: /c/portal/getImageAttachment?filename=Netatmofig3.gif&amp;amp;userId=48754</description>
    <dc:creator>Marco Thiel</dc:creator>
    <dc:date>2014-09-15T22:35:11Z</dc:date>
  </item>
  <item rdf:about="https://community.wolfram.com/groups/-/m/t/170725">
    <title>Building a sous-vide controller using Raspberry Pi / Mathematica</title>
    <link>https://community.wolfram.com/groups/-/m/t/170725</link>
    <description>Sous vide is the method of cooking food in airtight bags using a water bath at a precise temperature.
The method is fantastic to get the meat and seafood cooked evenly at the cooking point you love most.

[url=http://modernistcuisine.com/2013/01/why-cook-sous-vide/]http://modernistcuisine.com/2013/01/why-cook-sous-vide/[/url]
[url=http://www.douglasbaldwin.com/sous-vide.html]http://www.douglasbaldwin.com/sous-vide.html[/url]
An off-the-shelf sous vide machine can run several hundred dollars. One of the many ways you can explore the world of modernist cooking is with your Raspberry Pi, some sensors and extra electronics, Mathematica, and a crock pot.
  
This initial posting covers the very basic building blocks: connecting the temperature probes and turning the crock pot on and off.

I hope that through the community postings we can develop a full-fledged solution using the Raspberry Pi that allows you to monitor the temperature of the water bath and the food, set the on/off temperatures, chart the cooking process, send an SMS text/email when the food is done, etc.

Let&amp;#039;s start with the basics.

We&amp;#039;ll need to turn on/off the water bath (crock pot). For that we&amp;#039;ll need to control a relay.
You can build your own circuit. Gaven McDonald&amp;#039;s instructional video is a great starting point to build your own circuit and check how to connect the relay to your Raspberry Pi.
[url=https://www.youtube.com/watch?v=b6ZagKRnRdM]https://www.youtube.com/watch?v=b6ZagKRnRdM[/url]

You can buy a 2-relay module that works for Arduino/Raspberry Pi, just like this one.
[url=http://www.sainsmart.com/arduino/arduino-components/relays/arduino-pro-mini.html]http://www.sainsmart.com/arduino/arduino-components/relays/arduino-pro-mini.html[/url]

[url=http://www.sainsmart.com/arduino/arduino-components/relays/arduino-pro-mini.html][img=width: 500px; height: 500px;]/c/portal/getImageAttachment?filename=e11b1280bd72caccef99cdb9d60d4685.jpg&amp;amp;userId=11733[/img][/url]

Opening/closing the relay is straightforward with Mathematica. Using one of the available pins (e.g. pin 17), you can power the crock pot on/off with the command

[mcode]DeviceWrite[&amp;#034;GPIO&amp;#034;,17-&amp;gt;1][/mcode]
Things get a little bit more challenging with taking temperature readings from the thermocouples.

As BobtheChemist pointed out in his blog ([url=http://www.bobthechemist.com/index.php/10-stuff/24-thanksgiving-pi]http://www.bobthechemist.com/index.php/10-stuff/24-thanksgiving-pi[/url]), the Raspberry Pi does not have analog pins in its GPIO (General Purpose Input Output). In his blog entry Bob shows how to overcome this limitation by using a capacitor and measuring how long it takes to charge it.

For this entry, I decided to document the use of an analog to digital converter (ADC). 

Checking the web, I found from several discussions and postings that the MCP 3008 would do the job.

[url=http://www.adafruit.com/products/856]http://www.adafruit.com/products/856[/url]
We can use this chip to hook up as many as 8 analog sensors in our project. In our case we&amp;#039;ll only need two: one for the water bath probe and another for the food probe.

The following wiring diagram shows how to connect the MCP 3008 to the GPIO (please focus on the right side of the MCP3008 wiring).
[url=http://learn.adafruit.com/reading-a-analog-in-and-controlling-audio-volume-with-the-raspberry-pi/connecting-the-cobbler-to-a-mcp3008]http://learn.adafruit.com/reading-a-analog-in-and-controlling-audio-volume-with-the-raspberry-pi/connecting-the-cobbler-to-a-mcp3008[/url]

For the thermocouples, you need to pay attention to the type of thermocouple you get.

For this specific project the replacement probes for the Maverick ET-73 will work just fine.
[url=http://www.amazon.com/gp/product/B004W8B3PC/ref=oh_details_o00_s00_i00?ie=UTF8&amp;amp;psc=1]http://www.amazon.com/gp/product/B004W8B3PC/ref=oh_details_o00_s00_i00?ie=UTF8&amp;amp;psc=1[/url]

The thermocouples must be connected to the MCP3008 channels CH0 and CH1 in the following manner.

[img=width: 240px; height: 320px;]/c/portal/getImageAttachment?filename=photo.JPG&amp;amp;userId=78214[/img]

We do need to determine the value for the fixed resistance. The best value is equal to the resistance expected when we reach the cooking temperature. As I like my steaks medium, I chose 60C as the point to use.
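
The divider arithmetic behind this (and behind the conversion used later) is, as a sketch: the MCP3008 reports a value x between 0 and 1023, so x/1024 = R2/(R1 + R2), which solved for the probe resistance gives R1 = (1024 - x) R2/x. Picking R2 close to the resistance expected at the target temperature puts the reading near mid-scale, where the divider is most sensitive.
[mcode](* probe resistance R1 from the ADC reading x and the fixed resistor R2 *)
r1[x_, r2_] := (1024 - x) r2/x[/mcode]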

Using a thermometer, the thermocouples and a multimeter, I measured the temperature of a glass of ice water, hot water and warm water. From the three points, we can find the function that represents the temperature as a function of the resistance of the thermocouple.[mcode]temp = {20.6, 42, 83.3} + 273.15
resistance = {220650., 95800., 26340.}
data = Transpose[{Log@resistance, Log@temp}]
lm = LinearModelFit[data, x, x]
lm[{&amp;#034;RSquared&amp;#034;}][/mcode]The model fits very well: R^2 = 0.998.

What is the expected resistance at 60C?
[mcode]invdata = Transpose[{Log@temp, Log@resistance}]
Fit[invdata, {1, x}, x]
(*74.3895 - 10.9297 x*)
f[x_] := 74.38949510675315` - 10.929736543045369` x
Exp[f[Log[60 + 273.15]]]
(*54344.9*)[/mcode]Thus I used 56K resistors for the thermocouples.

Now, on to the function needed to read the thermocouple values. We have to probe the analog inputs of the MCP 3008 via the GPIO. 

We can use the library for the MCP 3008 developed by Gabriel Perez-Cerezo
[url=http://gpcf.eu/projects/embedded/adc/]http://gpcf.eu/projects/embedded/adc/[/url]

Two header files are needed: gpio.h and mcp3008.h. 

I dropped them both into the /usr/include directory on the Raspberry Pi.

The other very important step is exporting the GPIO pins into /sys/class/gpio; Gabriel also provides the script needed on his web page. Please make sure to follow his instructions found in the comment section of the script. I forgot to run the [b]update-rc.d -f gpio defaults[/b] command after the installation and spent quite a bit of time debugging after rebooting the equipment several days later: I was getting an error in Mathematica (and a C program I used to check the reading kept getting a segmentation fault), all because the step needed for the script to run at startup was not in place.

Once we have the libraries in place, we can build a function with MathLink to get the readings from the MCP 3008.

Please refer to the MathLink developer guide for more details on how it works:
[url=http://reference.wolfram.com/mathematica/tutorial/MathLinkDeveloperGuide-Unix.html]http://reference.wolfram.com/mathematica/tutorial/MathLinkDeveloperGuide-Unix.html[/url]

I built the two files needed for the function:
adc.tm[code]:Begin:	adc
:Pattern: 	adc[adc_Integer, clock_Integer, in_Integer, out_Integer, cs_Integer]
:Arguments:	{adc, clock, in, out, cs}
:ArgumentTypes:	{Integer, Integer, Integer, Integer, Integer}
:ReturnType:	Integer
:End:


[/code]adc.c
[code]#include &amp;lt;mathlink.h&amp;gt;
#include &amp;lt;mcp3008.h&amp;gt;

int adc(int adc, int clock, int in, int out, int cs) {
return mcp3008_value(adc, clock, in, out, cs);
}

int main(int argc, char *argv[]) {
return MLMain(argc, argv);
}[/code]After creating both files, I proceeded to compile the program with the following command.
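
A typical MathLink template build, assuming the mcc compiler driver that ships with Mathematica (paths and flags may differ on your install), looks like this:
[code]mcc adc.tm adc.c -o adc[/code]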

This generated the function that can now be invoked from Mathematica or the Wolfram Engine as follows.
[mcode]Install[&amp;#034;/home/pi/mathematica/adc/adc&amp;#034;];

(*We can now call function adc to read the voltage drop at the thermocouple
The voltage reading will be read by the MCP as a value between 0 (0V)to 1023 (3.3V) *)
(* Analog Channel = 0, ClockPin = 18, In = 23, Out =24, CS = 25 *)
adc[0, 18, 23, 24, 25]

(*The following function translates the voltage reading to temperature in Celsius*)

temp[channel_] := 

 Module[{R2 = 56000, a = -0.0913946, b = 6.80504, R1, 
   x = adc[channel, 18, 23, 24, 25]},
  R1 = (1024 - x) R2/x ; Exp[a Log[R1] + b] - 273.15]

(*Function datapoints collects temperature readings in a matrix of up to maxLength rows. It also controls the relay,
 turning the crockpot on when the temperature reading is below the set point and off when it is above the set point*)

datapoints[myList_List, fn_, maxLength_Integer, setPoint_Integer] := 

 Module[{x, val = fn},
  x = Append[myList, {DateList[], fn}];
  If[val &amp;lt; setPoint, DeviceWrite[&amp;#034;GPIO&amp;#034;, 17 -&amp;gt; 0], 
   DeviceWrite[&amp;#034;GPIO&amp;#034;, 17 -&amp;gt; 1]];
  If[Length[x] &amp;gt; maxLength, x = Take[x, -maxLength], x]]

data={};

(*Using a Chart to establish the setpoint and graph the temperature trend *)

Manipulate[

 DateListPlot[Refresh[data = datapoints[data, temp[0], 300, setPoint], 
   UpdateInterval -&amp;gt; 15, TrackedSymbols -&amp;gt; {}], Joined -&amp;gt; True, 
  PlotRange -&amp;gt; {Automatic, {20, 100}}, 
  GridLines -&amp;gt; {Automatic, {setPoint}}], {{setPoint, 60}, 30, 80, 1, 
  Appearance -&amp;gt; &amp;#034;Labeled&amp;#034;}]

[/mcode][img=width: 407px; height: 336px;]/c/portal/getImageAttachment?filename=7.png&amp;amp;userId=78214[/img]
This is a link to the video that shows the program running and controlling the relay.
[url=http://youtu.be/4ae42ctVZuk]http://youtu.be/4ae42ctVZuk[/url]

Next challenge will be to use the Web Server functionality of the Raspi to interact with Mathematica so as to control the set point and chart the temperature curve via a web page.
... to be continued.</description>
    <dc:creator>Diego Zviovich</dc:creator>
    <dc:date>2013-12-14T07:37:04Z</dc:date>
  </item>
  <item rdf:about="https://community.wolfram.com/groups/-/m/t/453169">
    <title>Wolfram Data Drop and the Raspberry Pi for Education</title>
    <link>https://community.wolfram.com/groups/-/m/t/453169</link>
<description>Yesterday, [Stephen Wolfram announced][1] the release of this great solution for storing and sharing data coming from sensors, devices, programs, humans or anything else: the [Wolfram Data Drop][2]. I think this is a turning point; it completely changes the game on how I&amp;#039;ve been interacting with streams of data. In this post I want to share with you three ideas that I&amp;#039;ve been exploring using the [Raspberry Pi 2][3], which, by the way, runs Mathematica about 10x faster than its predecessors!&#xD;
&#xD;
![Data Drop on the Raspberry Pi][4]&#xD;
![enter image description here][5]&#xD;
![enter image description here][6]&#xD;
&#xD;
The first idea that came to my mind was to revisit some of the experiments I had carried out in the past, like this [home alarm system][7]. In a matter of minutes, I was able to set up an activity tracker for my home&amp;#039;s hall. Every time I pass by, the [PIR motion sensor][8] adds a 1 to the &amp;#034;mov&amp;#034; variable that is being dropped to [this databin][9] every 20 minutes. Check it out in [W|A][10], it&amp;#039;s live and growing! [== Data drop 3v1UbpOM][11]&#xD;
&#xD;
![My home hall&amp;#039;s activity][12]&#xD;
![Periodic entries][13]&#xD;
&#xD;
This is such a great thing: that dataset is just about me, but it could be monitoring whatever you want, like your cat&amp;#039;s crazy habits. For this example the data is being logged periodically, but you could set it up in an event-based manner. Here, whenever a movement is detected, it triggers the [RaspiCam][15], which sends the snapshot to the following databin:&#xD;
&#xD;
![RaspiCam Databin][16]&#xD;
![Cumulative Activity Plot][17]&#xD;
&#xD;
What about making a several-days-long time-lapse of yourself?&#xD;
&#xD;
![Daily work at home][18]&#xD;
&#xD;
Or what about using other sensors? The possibilities are just endless!&#xD;
&#xD;
Finally, let me end with the following collaborative activity for the classroom. Here is how you can carry it out. &#xD;
&#xD;
First, you create a public databin that will hold the pairs of animal names the students will enter:&#xD;
&#xD;
    CreateDatabin[ &amp;#034;Interpretation&amp;#034; -&amp;gt; {&amp;#034;animal1&amp;#034; -&amp;gt; &amp;#034;Animal&amp;#034;, &amp;#034;animal2&amp;#034; -&amp;gt; &amp;#034;Animal&amp;#034;}, &amp;lt;|&amp;#034;Name&amp;#034; -&amp;gt; &amp;#034;Classroom Zoo&amp;#034;|&amp;gt;]&#xD;
&#xD;
Then, ask your students to submit their favorite animals&amp;#039; names, using the web-based platform http://wolfr.am/3zCzVgPJ&#xD;
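&#xD;
Entries can also be added programmatically with DatabinAdd; for example (using the bin&amp;#039;s short ID from above; the animal names are just placeholders):&#xD;
&#xD;
    DatabinAdd[Databin[&amp;#034;3zCzVgPJ&amp;#034;], &amp;lt;|&amp;#034;animal1&amp;#034; -&amp;gt; &amp;#034;cat&amp;#034;, &amp;#034;animal2&amp;#034; -&amp;gt; &amp;#034;owl&amp;#034;|&amp;gt;]&#xD;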
&#xD;
![Add new entry][19]&#xD;
&#xD;
Their individual entries will end up generating things similar to this amazing [Graph][20]!&#xD;
&#xD;
    data = Values[Databin[&amp;#034;3zCzVgPJ&amp;#034;]];&#xD;
    pairs = Apply[Rule, Drop[Transpose[{data[&amp;#034;animal1&amp;#034;], data[&amp;#034;animal2&amp;#034;]}], 9], {1}]&#xD;
![Name pairs][21]&#xD;
&#xD;
    pics = Map[# -&amp;gt; #[&amp;#034;Image&amp;#034;] &amp;amp;, Union[Flatten[Drop[Transpose[{data[&amp;#034;animal1&amp;#034;], data[&amp;#034;animal2&amp;#034;]}], 9]]]];&#xD;
    style ={ VertexSize-&amp;gt;1.2,EdgeStyle-&amp;gt;Directive[Arrowheads[{{.02,.6}}],Hue[.4,1,.3]],VertexShape-&amp;gt;pics};&#xD;
    Graph[pairs, style, ImageSize -&amp;gt; 900]&#xD;
![Animals Graph][22]&#xD;
&#xD;
Please, give [it a try][23]. Later, we will see what the giant graph ends up looking like. Or even more fun, share with us your ideas or databins that you want to be filled out collaboratively!&#xD;
&#xD;
&#xD;
  [1]: http://blog.wolfram.com/2015/03/04/the-wolfram-data-drop-is-live/&#xD;
  [2]: https://datadrop.wolframcloud.com/&#xD;
  [3]: http://www.wolfram.com/raspberry-pi/&#xD;
  [4]: /c/portal/getImageAttachment?filename=RaspberryPi_Data_Drop.png&amp;amp;userId=56204&#xD;
  [5]: /c/portal/getImageAttachment?filename=raspberry-pi-01.png&amp;amp;userId=56204&#xD;
  [6]: /c/portal/getImageAttachment?filename=raspberry-pi-02.png&amp;amp;userId=56204&#xD;
  [7]: http://community.wolfram.com/groups/-/m/t/226163&#xD;
  [8]: http://www.adafruit.com/products/189&#xD;
  [9]: http://wolfr.am/3v1UbpOM&#xD;
  [10]: https://www.wolframalpha.com/input/?i=Data%20drop%203v1UbpOM&#xD;
  [11]: https://www.wolframalpha.com/input/?i=Data%20drop%203v1UbpOM&#xD;
  [12]: /c/portal/getImageAttachment?filename=activity.png&amp;amp;userId=56204&#xD;
  [13]: /c/portal/getImageAttachment?filename=entries.png&amp;amp;userId=56204&#xD;
  [14]: /c/portal/getImageAttachment?filename=MyHomeHallActivity.png&amp;amp;userId=56204&#xD;
  [15]: http://community.wolfram.com/groups/-/m/t/157704&#xD;
  [16]: /c/portal/getImageAttachment?filename=HomeActivity.jpg&amp;amp;userId=56204&#xD;
  [17]: /c/portal/getImageAttachment?filename=cumulativePlot.png&amp;amp;userId=56204&#xD;
  [18]: /c/portal/getImageAttachment?filename=timelapse_Bernat.gif&amp;amp;userId=56204&#xD;
  [19]: /c/portal/getImageAttachment?filename=inputDD.png&amp;amp;userId=56204&#xD;
  [20]: http://reference.wolfram.com/language/ref/Graph.html&#xD;
  [21]: /c/portal/getImageAttachment?filename=ClassZoo.jpg&amp;amp;userId=56204&#xD;
  [22]: /c/portal/getImageAttachment?filename=AnimalsGraph.jpg&amp;amp;userId=56204&#xD;
  [23]: http://wolfr.am/3zCzVgPJ</description>
    <dc:creator>Bernat Espigulé</dc:creator>
    <dc:date>2015-03-05T15:01:41Z</dc:date>
  </item>
  <item rdf:about="https://community.wolfram.com/groups/-/m/t/505425">
    <title>Train Detector - Where to Go Next?</title>
    <link>https://community.wolfram.com/groups/-/m/t/505425</link>
<description>I built a train detector (OK, I hooked up a microphone to a Raspberry Pi 2). I record CD-quality sound (44.1 kHz) to a sound file and run it through a C program to perform an FFT. You can then easily detect the signature of train whistles in the frequency spectrum. I then record and display the data using a conventional Spring Boot / MySQL application. All of this is running on a Raspberry Pi 2 with a 32 GB SD card. &#xD;
&#xD;
You can see it working at: [**Train Counter**][1]&#xD;
&#xD;
**Code is at:** https://github.com/geocolumbus/traindetector&#xD;
&#xD;
I&amp;#039;ve gotten as far as recording the time between trains, **and now I want to predict when the next train will come**. I think this is a classic statistics problem - &amp;#034;time between events&amp;#034; and &amp;#034;predict the probability of the next event&amp;#034;. **Any ideas on how to proceed?**&#xD;
&#xD;
Sample data are attached.&#xD;
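&#xD;
One simple way to start, sketched in Mathematica (`gaps` stands for your list of times between trains, in minutes; the distribution choice is an assumption, not a recommendation): if arrivals were a Poisson process, the gaps would be exponentially distributed, and you could estimate the rate and ask for the probability of a train within some window.&#xD;
&#xD;
    dist = EstimatedDistribution[gaps, ExponentialDistribution[\[Lambda]]];&#xD;
    CDF[dist, 30]  (* probability of a train within the next 30 minutes *)&#xD;
&#xD;
Comparing the fit against, say, GammaDistribution would show whether the arrivals are actually memoryless.&#xD;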
&#xD;
![enter image description here][2]&#xD;
&#xD;
&#xD;
  [1]: http://www.traincounter.com&#xD;
  [2]: /c/portal/getImageAttachment?filename=2015-05-27_12-39-52.png&amp;amp;userId=11733</description>
    <dc:creator>George Campbell</dc:creator>
    <dc:date>2015-05-27T13:07:50Z</dc:date>
  </item>
  <item rdf:about="https://community.wolfram.com/groups/-/m/t/1153218">
    <title>Controlling a Meccano G15 KS robot with the EZ-Robot system</title>
    <link>https://community.wolfram.com/groups/-/m/t/1153218</link>
    <description>![Controlling a Meccano G15 KS robot with the EZ-Robot system][1]&#xD;
&#xD;
Hi there, (Sharing an idea)&#xD;
&#xD;
I am a hobbyist user of Mathematica.  I am using it to control my three robot systems. Firstly there is Jeeves. It is a modified Meccano G15 KS. The Meccano control system has been removed and replaced with another control system (EZ-Robot). This new controller and its PC based control software is capable of starting an external program. In my case that external program is Mathematica. The sensors on the robot read in data and pass these data onto the PC based control software which then fires up Mathematica which in turn carries out some computation and passes a result back to the PC based control software. Data is stored in files. Using this set up means there can be latency. Sometimes quite a bit. For me this is not a problem as the robot is mainly used to test algorithms. Waiting a minute for a result is not an issue.&#xD;
![Modified Meccano G15 KS][2]&#xD;
&#xD;
&#xD;
This robot can read text from a sheet of paper or a screen and then repeat what it has seen. It can also read text and evaluate it. For example, given the question &amp;#034;What is the capital of England&amp;#034;, the robot will reply &amp;#034;London&amp;#034;. It can recognise objects in an image (potentially thousands). By reading in text similar to &amp;#034;AABBCCD&amp;#034;, it will play the sounds of the corresponding musical notes. All this is made possible by Mathematica. As stated above, I use this robot for algorithm testing; in this case Mathematica is used indirectly.&#xD;
&#xD;
My second robot is a small 16 DOF humanoid robot. This robot is controlled by a 24 channel Pololu Maestro servo controller.  &#xD;
![Modified EZ-Robot][3]&#xD;
![Rear of modified EZ][4]&#xD;
&#xD;
Now this robot is controlled directly by Mathematica. All code is contained in a notebook and uses the connected-devices features of Mathematica. Data can be read from and sent to the robot. Just by using the Manipulate function, every servo in the robot can be controlled by Mathematica. I have not done so yet, but creating a robot animation system with Mathematica would take only a few dozen lines of code; that is one of my first tasks. This robot is a modified EZ-Robot, now totally under the control of Mathematica.&#xD;
&#xD;
My third robot is a robot workbench, built by me to test more complex algorithms. It is controlled by a 24-channel Pololu Maestro servo controller, in turn controlled completely by, you guessed it, Mathematica.&#xD;
&#xD;
![Robot Workbench 1][5]&#xD;
&#xD;
![Robot Workbench 2][6]&#xD;
&#xD;
![Robot Workbench 3][7]&#xD;
&#xD;
![Robot Workbench 4][8]&#xD;
&#xD;
![Robot Workbench 5][9]&#xD;
&#xD;
&#xD;
The robot workbench only has a servo control program written at the moment. This control program makes use of the Manipulate function. I have some great plans for this workbench in the future: under the control of Mathematica it will perform tasks using the full power of neural networks, machine learning, and image and text processing, to name but a few.&#xD;
&#xD;
My big goal is to create a Robot Operating System that will contain all the code required for the robots to carry out complex tasks. This would include features such as inverse and forward kinematics. I want to achieve this using only Mathematica.&#xD;
&#xD;
My reason for putting my robots on the forum is simply to see if anyone is doing anything similar. With only a few lines of Mathematica code I have been able to get the robots to read text and interpret it. I am sure that my Robot Operating System will be thousands of lines of code. I plan to write it in two parts. A front end to carry out general robot tasks, such as movement and manipulation, data processing from sensors and the solving of problems using neural nets and machine learning. A back end that can be made to match up to a specific controller or micro controller. That is my plan. Any code I produce will be made available to this forum.&#xD;
&#xD;
Well I had better get back to coding.&#xD;
&#xD;
&#xD;
----------&#xD;
&#xD;
&#xD;
# CODE&#xD;
&#xD;
&#xD;
----------&#xD;
&#xD;
&#xD;
The code for the G15 KS simply performs a task and returns a result to the control software of the robot. There is no direct control. The humanoid robot and the workbench code have direct control.&#xD;
&#xD;
The following is the code that allows the modified G15 KS to read text and repeat what it has seen.&#xD;
&#xD;
    str = TextRecognize[Import[&amp;#034;C:\\mathscripts\\images\\img3.jpg&amp;#034;],  Language -&amp;gt; &amp;#034;English&amp;#034;];&#xD;
    &#xD;
    str = StringReplace[str, Whitespace -&amp;gt; &amp;#034; &amp;#034;];&#xD;
    &#xD;
    str&#xD;
    &#xD;
    s = OpenWrite[File[&amp;#034;C:\\mathscripts\\jeeves\\output.txt&amp;#034;]]&#xD;
    &#xD;
    WriteLine[s, str];&#xD;
    &#xD;
    Close[s];&#xD;
&#xD;
Here the image taken by the robot&amp;#039;s camera is picked up by Mathematica, and the TextRecognize function gets the text from the image and stores it in a file ready to be used by the robot&amp;#039;s control system. Pretty simple coding to get a robot to read text.&#xD;
&#xD;
The following code allows the G15 KS to read text and evaluate it.&#xD;
&#xD;
    str = TextRecognize[Import[&amp;#034;C:\\mathscripts\\images\\img3.jpg&amp;#034;], Language -&amp;gt; &amp;#034;English&amp;#034;];&#xD;
    &#xD;
    str = StringReplace[str, Whitespace -&amp;gt; &amp;#034; &amp;#034;];&#xD;
    &#xD;
    str&#xD;
    &#xD;
    s = OpenWrite[File[&amp;#034;C:\\mathscripts\\jeeves\\output2.txt&amp;#034;]]&#xD;
    &#xD;
    res = Interpreter[&amp;#034;SemanticExpression&amp;#034;][str];&#xD;
    &#xD;
    If[NumericQ[res], res = N[res], res]&#xD;
    &#xD;
    WriteLine[s, ToString[res]];&#xD;
    &#xD;
    Close[s];&#xD;
&#xD;
The following code allows the G15 KS to identify an object within an image. Actually, thousands of objects.&#xD;
&#xD;
    txt = ImageIdentify[Import[&amp;#034;C:\\mathscripts\\images\\img3.jpg&amp;#034;]]&#xD;
    &#xD;
    s = OpenWrite[File[&amp;#034;C:\\mathscripts\\jeeves\\output3.txt&amp;#034;]]&#xD;
    &#xD;
    WriteLine[s, ToString[CommonName[txt]]];&#xD;
    &#xD;
    Close[s];&#xD;
&#xD;
The following code allows the G15 KS to read text relating to geographic location. This was used by the robot to read the text generated by a mobile phone app and speak the result.&#xD;
&#xD;
    str = TextRecognize[Import[&amp;#034;C:\\mathscripts\\images\\img3.jpg&amp;#034;], Language -&amp;gt; &amp;#034;English&amp;#034;];&#xD;
    str = StringReplace[str, Whitespace -&amp;gt; &amp;#034; &amp;#034;];&#xD;
    str = StringSplit[str];&#xD;
    If[NumberQ[ToExpression[str[[1]]]] &amp;amp;&amp;amp; NumberQ[ToExpression[str[[2]]]],&#xD;
      $GeoLocation = &#xD;
       GeoPosition[{ToExpression[str[[1]]], ToExpression[str[[2]]]}];&#xD;
      country = CountryData[$GeoLocationCountry, &amp;#034;Name&amp;#034;];&#xD;
      citytown = CityData[$GeoLocationCity, &amp;#034;Name&amp;#034;];&#xD;
      s = OpenWrite[File[&amp;#034;C:\\mathscripts\\jeeves\\output4.txt&amp;#034;]];&#xD;
      WriteLine[s, country &amp;lt;&amp;gt; &amp;#034; is the country I am in and the nearest town or city is called &amp;#034; &amp;lt;&amp;gt; citytown];&#xD;
      Close[s];,&#xD;
      s = OpenWrite[File[&amp;#034;C:\\mathscripts\\jeeves\\output4.txt&amp;#034;]];&#xD;
      WriteLine[s, &amp;#034;Invalid input Please try again &amp;#034; ];&#xD;
      Close[s];];&#xD;
&#xD;
The following code allows the robot to play a tune.  The text the robot must read is as follows: AABBCCD&#xD;
&#xD;
    str = TextRecognize[Import[&amp;#034;C:\\mathscripts\\images\\img3.jpg&amp;#034;], Language -&amp;gt; &amp;#034;English&amp;#034;];&#xD;
    &#xD;
    str = StringReplace[str, Whitespace -&amp;gt; &amp;#034;&amp;#034;];&#xD;
    &#xD;
    str&#xD;
    &#xD;
    s = OpenWrite[File[&amp;#034;C:\\mathscripts\\jeeves\\output5.txt&amp;#034;]]&#xD;
    &#xD;
    WriteLine[s, str];&#xD;
    &#xD;
    Close[s];&#xD;
&#xD;
It can be seen from the code that most of the work is passing data around in files. The code to carry out the text and image analysis is a single line; it does not get better than that.&#xD;
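&#xD;
The servo code that follows repeats the same Pololu Maestro &amp;#034;set target&amp;#034; packet for every joint; as a sketch, it could be factored into one helper (`setServo` is my own name; the Maestro protocol takes the target in quarter-microseconds, split into two 7-bit bytes):&#xD;
&#xD;
    setServo[dev_, channel_, us_] := DeviceWriteBuffer[dev,&#xD;
      {132, channel, BitAnd[us*4, 127], BitAnd[BitShiftRight[us*4, 7], 127]}]&#xD;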
&#xD;
The code that follows is used to control all of the humanoid and workbench servos. First, the workbench:&#xD;
&#xD;
    Button[&amp;#034;Open connection to Maestro&amp;#034;, dev = DeviceOpen[&amp;#034;Serial&amp;#034;, &amp;#034;COM5&amp;#034;]]&#xD;
    &#xD;
    Manipulate[&#xD;
     DeviceWriteBuffer[&#xD;
      dev, {132, 1, BitAnd[a*4, 127], &#xD;
       BitAnd[BitShiftRight[(a*4), 7] , 127]}], {{a, 1224, &amp;#034;Right Base&amp;#034;}, &#xD;
      496, 2016, 1}]&#xD;
    &#xD;
    Manipulate[&#xD;
     DeviceWriteBuffer[&#xD;
      dev, {132, 0, BitAnd[a*4, 127], &#xD;
       BitAnd[BitShiftRight[(a*4), 7] , 127]}], {{a, 1224, &amp;#034;Left Base&amp;#034;}, &#xD;
      496, 2016, 1}]&#xD;
    &#xD;
    Manipulate[&#xD;
     DeviceWriteBuffer[&#xD;
      dev, {132, 3, BitAnd[a*4, 127], &#xD;
       BitAnd[BitShiftRight[(a*4), 7] , 127]}], {{a, 1216, &amp;#034;Right Pivot&amp;#034;},&#xD;
       1008, 2000, 1}]&#xD;
    &#xD;
    Manipulate[&#xD;
     DeviceWriteBuffer[&#xD;
      dev, {132, 2, BitAnd[a*4, 127], &#xD;
       BitAnd[BitShiftRight[(a*4), 7] , 127]}], {{a, 1216, &amp;#034;Left Pivot&amp;#034;}, &#xD;
      1008, 2000, 1}]&#xD;
    &#xD;
    Manipulate[&#xD;
     DeviceWriteBuffer[&#xD;
      dev, {132, 4, BitAnd[a*4, 127], &#xD;
       BitAnd[BitShiftRight[(a*4), 7] , 127]}], {{a, 1216, &amp;#034;Right Elbow&amp;#034;},&#xD;
       1024, 2144, 1}]&#xD;
    &#xD;
    Manipulate[&#xD;
     DeviceWriteBuffer[&#xD;
      dev, {132, 5, BitAnd[a*4, 127], &#xD;
       BitAnd[BitShiftRight[(a*4), 7] , 127]}], {{a, 1270, &amp;#034;Left Elbow&amp;#034;}, &#xD;
      1024, 2144, 1}]&#xD;
    &#xD;
    Manipulate[&#xD;
     DeviceWriteBuffer[&#xD;
      dev, {132, 14, BitAnd[a*4, 127], &#xD;
       BitAnd[BitShiftRight[(a*4), 7] , 127]}], {{a, 1240, &amp;#034;Right Wrist&amp;#034;},&#xD;
       496, 2000, 1}]&#xD;
    &#xD;
    Manipulate[&#xD;
     DeviceWriteBuffer[&#xD;
      dev, {132, 15, BitAnd[a*4, 127], &#xD;
       BitAnd[BitShiftRight[(a*4), 7] , 127]}], {{a, 1182, &amp;#034;Left Wrist&amp;#034;}, &#xD;
      496, 2000, 1}]&#xD;
    &#xD;
    Manipulate[&#xD;
     DeviceWriteBuffer[&#xD;
      dev, {132, 7, BitAnd[a*4, 127], &#xD;
       BitAnd[BitShiftRight[(a*4), 7] , 127]}], {{a, 1200, &#xD;
       &amp;#034;Right Gripper&amp;#034;}, 1024, 1296, 1}]&#xD;
    &#xD;
    Manipulate[&#xD;
     DeviceWriteBuffer[&#xD;
      dev, {132, 6, BitAnd[a*4, 127], &#xD;
       BitAnd[BitShiftRight[(a*4), 7] , 127]}], {{a, 1200, &#xD;
       &amp;#034;Left Gripper&amp;#034;}, 1024, 1296, 1}]&#xD;
    &#xD;
    Manipulate[&#xD;
     DeviceWriteBuffer[&#xD;
      dev, {132, 8, BitAnd[a*4, 127], &#xD;
       BitAnd[BitShiftRight[(a*4), 7] , 127]}], {{a, 1480, &amp;#034;Turn Table&amp;#034;}, &#xD;
      992, 2000, 1}]&#xD;
    &#xD;
    Manipulate[&#xD;
     DeviceWriteBuffer[&#xD;
      dev, {132, 10, BitAnd[a*4, 127], &#xD;
       BitAnd[BitShiftRight[(a*4), 7] , 127]}], {{a, 1240, &#xD;
       &amp;#034;Move Vertical&amp;#034;}, 496, 2000, 1}]&#xD;
    &#xD;
    Manipulate[&#xD;
     DeviceWriteBuffer[&#xD;
      dev, {132, 11, BitAnd[a*4, 127], &#xD;
       BitAnd[BitShiftRight[(a*4), 7] , 127]}], {{a, 1200, &#xD;
       &amp;#034;Move Horizontal&amp;#034;}, 800, 1600, 1}]&#xD;
    &#xD;
    DeviceClose[dev];&#xD;
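Every Manipulate above sends the same four-byte Maestro set-target packet: the command byte 132 (0x84), the channel number, and the target split into two 7-bit bytes. The target is expressed in quarter-microseconds, which is why each slider value is multiplied by 4 before the BitAnd and BitShiftRight calls. A minimal sketch of that packing, here in Python for illustration (the function name is my own):

```python
def maestro_set_target(channel, pulse_us):
    # Pololu Maestro compact-protocol set-target command:
    # 0x84 (132), channel, low 7 bits of target, high 7 bits of target.
    # The target is in quarter-microseconds, hence pulse_us * 4.
    target = pulse_us * 4
    low7 = target % 128             # same as BitAnd[a*4, 127]
    high7 = (target // 128) % 128   # same as BitAnd[BitShiftRight[a*4, 7], 127]
    return [132, channel, low7, high7]
```

For example, the Right Base slider's default of 1224 on channel 1 packs to the bytes 132, 1, 32, 38, matching what the corresponding DeviceWriteBuffer call writes.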
&#xD;
Now the code to move the humanoid servos using Manipulate.&#xD;
&#xD;
    dev = DeviceOpen[&amp;#034;Serial&amp;#034;, &amp;#034;COM5&amp;#034;]&#xD;
    Manipulate[&#xD;
     DeviceWriteBuffer[&#xD;
      dev, {132, 0, BitAnd[a*4, 127], &#xD;
       BitAnd[BitShiftRight[(a*4), 7] , 127]}]; &#xD;
     DeviceWriteBuffer[&#xD;
      dev, {132, 1, BitAnd[b*4, 127], &#xD;
       BitAnd[BitShiftRight[(b*4), 7] , 127]}]; &#xD;
     DeviceWriteBuffer[&#xD;
      dev, {132, 2, BitAnd[c*4, 127], &#xD;
       BitAnd[BitShiftRight[(c*4), 7] , 127]}]; &#xD;
     DeviceWriteBuffer[&#xD;
      dev, {132, 3, BitAnd[d*4, 127], &#xD;
       BitAnd[BitShiftRight[(d*4), 7] , 127]}]; &#xD;
     DeviceWriteBuffer[&#xD;
      dev, {132, 5, BitAnd[e*4, 127], &#xD;
       BitAnd[BitShiftRight[(e*4), 7] , 127]}]; &#xD;
     DeviceWriteBuffer[&#xD;
      dev, {132, 4, BitAnd[f*4, 127], &#xD;
       BitAnd[BitShiftRight[(f*4), 7] , 127]}]; &#xD;
     DeviceWriteBuffer[&#xD;
      dev, {132, 6, BitAnd[g*4, 127], &#xD;
       BitAnd[BitShiftRight[(g*4), 7] , 127]}];&#xD;
     DeviceWriteBuffer[&#xD;
      dev, {132, 7, BitAnd[h*4, 127], &#xD;
       BitAnd[BitShiftRight[(h*4), 7] , 127]}];&#xD;
     DeviceWriteBuffer[&#xD;
      dev, {132, 15, BitAnd[l*4, 127], &#xD;
       BitAnd[BitShiftRight[(l*4), 7] , 127]}]; &#xD;
     DeviceWriteBuffer[&#xD;
      dev, {132, 14, BitAnd[k*4, 127], &#xD;
       BitAnd[BitShiftRight[(k*4), 7] , 127]}], {{a, 1224, &amp;#034;Left Base&amp;#034;}, &#xD;
      496, 2016, 1}, {{b, 1224, &amp;#034;Right Base&amp;#034;}, 496, 2016, &#xD;
      1}, {{c, 1216, &amp;#034;Left Pivot&amp;#034;}, 1008, 2000, &#xD;
      1}, {{d, 1216, &amp;#034;Right Pivot&amp;#034;}, 1008, 2000, &#xD;
      1}, {{e, 1270, &amp;#034;Left Elbow&amp;#034;}, 1024, 2144, &#xD;
      1}, {{f, 1270, &amp;#034;Right Elbow&amp;#034;}, 1024, 2144, &#xD;
      1}, {{k, 1024, &amp;#034;Right Wrist&amp;#034;}, 496, 2000, &#xD;
      1}, {{l, 1024, &amp;#034;Left Wrist&amp;#034;}, 496, 2000, &#xD;
      1}, {{g, 1200, &amp;#034;Left Gripper&amp;#034;}, 1024, 1296, &#xD;
      1}, {{h, 1200, &amp;#034;Right Gripper&amp;#034;}, 1024, 1296, 1}]&#xD;
&#xD;
That&amp;#039;s all for now; there will be a lot more to come at a later date. As the examples above show, very little code is required to get a great deal of functionality.&#xD;
&#xD;
&#xD;
  [1]: https://community.wolfram.com//c/portal/getImageAttachment?filename=Main111120224.png&amp;amp;userId=20103&#xD;
  [2]: http://community.wolfram.com//c/portal/getImageAttachment?filename=20170722_204150%282%29.jpg&amp;amp;userId=1152078&#xD;
  [3]: http://community.wolfram.com//c/portal/getImageAttachment?filename=20170722_210215%282%29.jpg&amp;amp;userId=1152078&#xD;
  [4]: http://community.wolfram.com//c/portal/getImageAttachment?filename=20170722_210324%283%29.jpg&amp;amp;userId=1152078&#xD;
  [5]: http://community.wolfram.com//c/portal/getImageAttachment?filename=20170722_205616%282%29.jpg&amp;amp;userId=1152078&#xD;
  [6]: http://community.wolfram.com//c/portal/getImageAttachment?filename=20170722_205908%282%29.jpg&amp;amp;userId=1152078&#xD;
  [7]: http://community.wolfram.com//c/portal/getImageAttachment?filename=20170722_210128%283%29.jpg&amp;amp;userId=1152078&#xD;
  [8]: http://community.wolfram.com//c/portal/getImageAttachment?filename=20170722_204931%282%29.jpg&amp;amp;userId=1152078&#xD;
  [9]: http://community.wolfram.com//c/portal/getImageAttachment?filename=20170722_205445%282%29.jpg&amp;amp;userId=1152078</description>
    <dc:creator>Terence Smith</dc:creator>
    <dc:date>2017-07-26T17:11:41Z</dc:date>
  </item>
</rdf:RDF>

