<?xml version="1.0" encoding="UTF-8"?>
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns="http://purl.org/rss/1.0/" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel rdf:about="https://community.wolfram.com">
    <title>Community RSS Feed</title>
    <link>https://community.wolfram.com</link>
    <description>RSS Feed for Wolfram Community showing any discussions tagged with Connected Devices sorted by most viewed.</description>
    <items>
      <rdf:Seq>
        <rdf:li rdf:resource="https://community.wolfram.com/groups/-/m/t/170725" />
        <rdf:li rdf:resource="https://community.wolfram.com/groups/-/m/t/418132" />
        <rdf:li rdf:resource="https://community.wolfram.com/groups/-/m/t/246929" />
        <rdf:li rdf:resource="https://community.wolfram.com/groups/-/m/t/315748" />
        <rdf:li rdf:resource="https://community.wolfram.com/groups/-/m/t/344278" />
        <rdf:li rdf:resource="https://community.wolfram.com/groups/-/m/t/1028536" />
        <rdf:li rdf:resource="https://community.wolfram.com/groups/-/m/t/1057588" />
        <rdf:li rdf:resource="https://community.wolfram.com/groups/-/m/t/196759" />
        <rdf:li rdf:resource="https://community.wolfram.com/groups/-/m/t/344241" />
        <rdf:li rdf:resource="https://community.wolfram.com/groups/-/m/t/456947" />
        <rdf:li rdf:resource="https://community.wolfram.com/groups/-/m/t/23261" />
        <rdf:li rdf:resource="https://community.wolfram.com/groups/-/m/t/181641" />
        <rdf:li rdf:resource="https://community.wolfram.com/groups/-/m/t/90931" />
        <rdf:li rdf:resource="https://community.wolfram.com/groups/-/m/t/1179035" />
        <rdf:li rdf:resource="https://community.wolfram.com/groups/-/m/t/992466" />
        <rdf:li rdf:resource="https://community.wolfram.com/groups/-/m/t/825781" />
        <rdf:li rdf:resource="https://community.wolfram.com/groups/-/m/t/1098055" />
        <rdf:li rdf:resource="https://community.wolfram.com/groups/-/m/t/553861" />
        <rdf:li rdf:resource="https://community.wolfram.com/groups/-/m/t/408056" />
        <rdf:li rdf:resource="https://community.wolfram.com/groups/-/m/t/453169" />
      </rdf:Seq>
    </items>
  </channel>
  <item rdf:about="https://community.wolfram.com/groups/-/m/t/170725">
    <title>Building a sous-vide controller using Raspberry Pi / Mathematica</title>
    <link>https://community.wolfram.com/groups/-/m/t/170725</link>
    <description>Sous vide is the method of cooking food in airtight bags using a water bath at a precise temperature.
The method is fantastic to get the meat and seafood cooked evenly at the cooking point you love most.

[url=http://modernistcuisine.com/2013/01/why-cook-sous-vide/]http://modernistcuisine.com/2013/01/why-cook-sous-vide/[/url]
[url=http://www.douglasbaldwin.com/sous-vide.html]http://www.douglasbaldwin.com/sous-vide.html[/url]
Buying an off-the-shelf sous vide machine can run several hundred dollars. One of the many ways you can explore the world of modernist cooking is with your Raspberry Pi, some sensors and extra electronics, Mathematica, and a crock pot.
  
This initial posting will cover the basic building blocks: connecting the temperature probes and turning the crock pot on/off.

I hope that through the community postings we can develop a full-fledged solution using the Raspberry Pi that lets you monitor the temperature of the water bath and the food, set the on/off temperatures, chart the cooking process, send an SMS/email when the food is done, etc.

Let&amp;#039;s start with the basics.

We&amp;#039;ll need to turn the water bath (crock pot) on and off. For that we&amp;#039;ll need to control a relay.
You can build your own circuit: Gaven McDonald&amp;#039;s instructional video is a great starting point for building it and for checking how to connect the relay to your Raspberry Pi.
[url=https://www.youtube.com/watch?v=b6ZagKRnRdM]https://www.youtube.com/watch?v=b6ZagKRnRdM[/url]

Alternatively, you can buy a 2-relay module that works with the Arduino/Raspberry Pi, like this one.
[url=http://www.sainsmart.com/arduino/arduino-components/relays/arduino-pro-mini.html]http://www.sainsmart.com/arduino/arduino-components/relays/arduino-pro-mini.html[/url]

[url=http://www.sainsmart.com/arduino/arduino-components/relays/arduino-pro-mini.html][img=width: 500px; height: 500px;]/c/portal/getImageAttachment?filename=e11b1280bd72caccef99cdb9d60d4685.jpg&amp;amp;userId=11733[/img][/url]

Opening/closing the relay is straightforward with Mathematica. Using one of the available pins (e.g. pin 17), you can switch the crock pot on/off with the command

[mcode]DeviceWrite[&amp;#034;GPIO&amp;#034;,17-&amp;gt;1][/mcode]
Things get a little more challenging when taking temperature readings from the thermocouples.

As BobtheChemist pointed out in his blog,
[url=http://www.bobthechemist.com/index.php/10-stuff/24-thanksgiving-pi]http://www.bobthechemist.com/index.php/10-stuff/24-thanksgiving-pi[/url]
the Raspberry Pi does not have analog pins in its GPIO (General Purpose Input/Output) header. In his blog entry Bob shows how to overcome this limitation by using a capacitor and measuring how long it takes to charge.

For this entry, I decided to document the use of an analog to digital converter (ADC). 

Checking the web, I found from several discussions and postings that the MCP3008 would do the job.

[url=http://www.adafruit.com/products/856]http://www.adafruit.com/products/856[/url]
We can use this chip to hook up to 8 analog sensors to our project. In our case, we&amp;#039;ll only need two: one for the water bath probe and another for the food probe.

The following wiring diagram covers how to connect the MCP3008 to the GPIO (please focus on the right side of the MCP3008 wiring).
[url=http://learn.adafruit.com/reading-a-analog-in-and-controlling-audio-volume-with-the-raspberry-pi/connecting-the-cobbler-to-a-mcp3008]http://learn.adafruit.com/reading-a-analog-in-and-controlling-audio-volume-with-the-raspberry-pi/connecting-the-cobbler-to-a-mcp3008[/url]

For the thermocouples, you need to watch out for the type of probe you get; this project assumes probes whose resistance varies with temperature.

For this specific project the replacement probes for the Maverick ET-73 will work just fine.
[url=http://www.amazon.com/gp/product/B004W8B3PC/ref=oh_details_o00_s00_i00?ie=UTF8&amp;amp;psc=1]http://www.amazon.com/gp/product/B004W8B3PC/ref=oh_details_o00_s00_i00?ie=UTF8&amp;amp;psc=1[/url]

The thermocouples must be connected to the MCP3008 channels CH0 and CH1 in the following manner.

[img=width: 240px; height: 320px;]/c/portal/getImageAttachment?filename=photo.JPG&amp;amp;userId=78214[/img]

We do need to determine the value for the fixed resistance. The best value is equal to the probe resistance expected when we reach the cooking temperature. As I like my steaks medium, I chose 60C as the point to use.
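Why should the fixed resistor match the probe resistance at the target temperature? The MCP3008 sees the midpoint of a voltage divider, so the ADC count is x = 1024 R2/(R1 + R2) (the same relation that is inverted in the temp function further down). A quick symbolic check, in my own notation rather than anything from the original build:

[mcode]reading[r1_, r2_] := 1024 r2/(r1 + r2);
(*sensitivity of the ADC count to a change in the probe resistance r1*)
sens[r1_, r2_] = -D[reading[r1, r2], r1];
(*for a given probe resistance r1, which fixed resistor r2 maximizes it?*)
Solve[D[sens[r1, r2], r2] == 0, r2]
(*{{r2 -&amp;gt; r1}}*)[/mcode]

The reading is most sensitive to the probe exactly when the two resistances are equal, which is why the resistance at the cooking temperature sets the target.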

Using a thermometer, the thermocouples, and a multimeter, I measured the temperature and probe resistance of a glass of ice water, warm water, and hot water. Using the three points, we can find the function that represents the temperature based on the resistance of the thermocouple.[mcode]temp = {20.6, 42, 83.3} + 273.15
resistance = {220650., 95800., 26340.}
data = Transpose[{Log@resistance, Log@temp}]
lm = LinearModelFit[data, x, x]
lm[{&amp;#034;RSquared&amp;#034;}][/mcode]The model fits very well: R^2 = 0.998.

What is the expected resistance at 60C?
[mcode]invdata = Transpose[{Log@temp, Log@resistance}]
Fit[invdata, {1, x}, x]
(*74.3895 - 10.9297 x*)
f[x_] := 74.38949510675315` - 10.929736543045369` x
Exp[f[Log[60 + 273.15]]]
(*54344.9*)[/mcode]Thus I used 56K resistors (the nearest standard value) for the thermocouples.

Now, on to the function needed to read the thermocouple values. We have to poll the analog inputs of the MCP3008 via the GPIO.

We can use the library for the MCP3008 developed by Gabriel Perez-Cerezo:
[url=http://gpcf.eu/projects/embedded/adc/]http://gpcf.eu/projects/embedded/adc/[/url]

There are two headers needed: gpio.h and mcp3008.h.

Drop them both into the /usr/include directory on the Raspberry Pi.

The other very important step is exporting the GPIO pins into /sys/class/gpio; Gabriel also provides the script needed on his web page. Please make sure to follow his instructions found in the comment section of the script. I forgot to run the [b]update-rc.d -f gpio defaults[/b] command after the installation and spent quite a bit of time troubleshooting when I rebooted the equipment several days later: I was getting an error in Mathematica (and a C program I wrote to check the readings kept ending in a segmentation fault), all because the script was never set to run at startup.
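For reference, what Gabriel&amp;#039;s script automates is the kernel&amp;#039;s standard sysfs GPIO interface; exporting a single pin by hand (pin 17 shown only as an example) looks like this:

[code]echo 17 &amp;gt; /sys/class/gpio/export            # make pin 17 visible to user space
echo out &amp;gt; /sys/class/gpio/gpio17/direction  # configure it as an output[/code]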

Once we have the libraries in place, we can build a function with MathLink to get the readings from the MCP3008.

Please refer to the MathLink developer guide for more details on how it works:
[url=http://reference.wolfram.com/mathematica/tutorial/MathLinkDeveloperGuide-Unix.html]http://reference.wolfram.com/mathematica/tutorial/MathLinkDeveloperGuide-Unix.html[/url]

I built the two files needed for the function:
adc.tm[code]:Begin:	adc
:Pattern: 	adc[adc_Integer, clock_Integer, in_Integer, out_Integer, cs_Integer]
:Arguments:	{adc, clock, in, out, cs}
:ArgumentTypes:	{Integer, Integer, Integer, Integer, Integer}
:ReturnType:	Integer
:End:


[/code]adc.c
[code]#include &amp;lt;mathlink.h&amp;gt;
#include &amp;lt;mcp3008.h&amp;gt;

int adc(int adc, int clock, int in, int out, int cs) {
return mcp3008_value(adc, clock, in, out, cs);
}

int main(int argc, char *argv[]) {
return MLMain(argc, argv);
}[/code]After creating both files, I proceeded to compile the program.
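The compile command itself didn&amp;#039;t make it into the post. As a sketch only (assuming the mcc script from the MathLink developer kit is on the path; its location varies by installation), a template program like this one is typically built with:

[code]# mcc runs mprep on the .tm template, then the C compiler/linker
mcc adc.tm adc.c -o adc[/code]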

This generated the necessary function that can now be invoked from Mathematica or the Wolfram Engine as follows.
[mcode]Install[&amp;#034;/home/pi/mathematica/adc/adc&amp;#034;];

(*We can now call function adc to read the voltage drop at the thermocouple
The voltage reading will be read by the MCP as a value between 0 (0 V) and 1023 (3.3 V) *)
(* Analog Channel = 0, ClockPin = 18, In = 23, Out =24, CS = 25 *)
adc[0, 18, 23, 24, 25]

(*The following function translates the voltage reading to temperature in Celsius*)

temp[channel_] := 

 Module[{R2 = 56000, a = -0.0913946, b = 6.80504, R1, 
   x = adc[channel, 18, 23, 24, 25]},
  R1 = (1024 - x) R2/x ; Exp[a Log[R1] + b] - 273.15]

(*Function datapoints is used to collect temperature readings in a list of length maxLength. It also controls the relay,
 turning the crock pot on when the temperature reading is below the set point and off when it is above*)

datapoints[myList_List, fn_, maxLength_Integer, setPoint_Integer] := 

 Module[{x, val = fn},
  x = Append[myList, {DateList[], fn}];
  If[val &amp;lt; setPoint, DeviceWrite[&amp;#034;GPIO&amp;#034;, 17 -&amp;gt; 0], 
   DeviceWrite[&amp;#034;GPIO&amp;#034;, 17 -&amp;gt; 1]];
  If[Length[x] &amp;gt; maxLength, x = Take[x, -maxLength], x]]

data={};

(*Using a Chart to establish the setpoint and graph the temperature trend *)

Manipulate[

 DateListPlot[Refresh[data = datapoints[data, temp[0], 300, setPoint], 
   UpdateInterval -&amp;gt; 15, TrackedSymbols -&amp;gt; {}], Joined -&amp;gt; True, 
  PlotRange -&amp;gt; {Automatic, {20, 100}}, 
  GridLines -&amp;gt; {Automatic, {setPoint}}], {{setPoint, 60}, 30, 80, 1, 
  Appearance -&amp;gt; &amp;#034;Labeled&amp;#034;}]

[/mcode][img=width: 407px; height: 336px;]/c/portal/getImageAttachment?filename=7.png&amp;amp;userId=78214[/img]
This is a link to the video that shows the program running and controlling the relay.
[url=http://youtu.be/4ae42ctVZuk]http://youtu.be/4ae42ctVZuk[/url]

The next challenge will be to use the web server functionality of the Raspberry Pi to interact with Mathematica, so as to control the set point and chart the temperature curve via a web page.
... to be continued.</description>
    <dc:creator>Diego Zviovich</dc:creator>
    <dc:date>2013-12-14T07:37:04Z</dc:date>
  </item>
  <item rdf:about="https://community.wolfram.com/groups/-/m/t/418132">
    <title>Free Wolfram Language on Raspberry Pi tutorial</title>
    <link>https://community.wolfram.com/groups/-/m/t/418132</link>
    <description>*NOTE: the main tutorial notebook is attached at the end of this post and [can be downloaded by clicking here][1].*&#xD;
&#xD;
&#xD;
----------&#xD;
&#xD;
&#xD;
I wanted to share the attached *Mathematica* notebook that I created for teaching kids (ages 9-14) about the Wolfram Language on the Raspberry Pi. It has a simplified (and colorful) interface for students and easy editing tools for teachers to create new content (even those with little or no experience using *Mathematica*). I am extremely grateful for the efforts of Anna Musser who very patiently helped me refine the interface over many iterations and piloted the first workshops using this notebook at Empow Studios!&#xD;
&#xD;
It includes a self-paced tutorial designed for beginning programmers who are young or young-at-heart. It also includes instructions for authoring your own tutorials. The interface is minimally dynamic so the tutorial will run as smoothly as possible on the Raspberry Pi model B; if there is interest, we could build a prettier dynamic interface for more powerful hardware. Please comment below with any improvements/changes that you would like to see, and of course please comment or upvote if you find this useful or interesting :)&#xD;
&#xD;
&#xD;
----------&#xD;
## Sample of the attached tutorial:&#xD;
&#xD;
&#xD;
&#xD;
![enter image description here][2]&#xD;
&#xD;
**COMPLETE TUTORIAL NOTEBOOK ATTACHED BELOW**&#xD;
&#xD;
&#xD;
  [1]: https://www.dropbox.com/s/mddbex45h7ynao3/FirstCourseOnRPI.nb?dl=1&#xD;
  [2]: http://community.wolfram.com//c/portal/getImageAttachment?filename=ScreenShot2017-12-05at11.46.57AM.png&amp;amp;userId=11733</description>
    <dc:creator>Kyle Keane</dc:creator>
    <dc:date>2015-01-07T18:16:20Z</dc:date>
  </item>
  <item rdf:about="https://community.wolfram.com/groups/-/m/t/246929">
    <title>One pixel thermal imaging camera with Mathematica and Arduino</title>
    <link>https://community.wolfram.com/groups/-/m/t/246929</link>
    <description>Triggered by a leak in my hot water boiler at home I built a thermal imaging camera using an Arduino and interfacing it with Mathematica. I tried to make up for the &amp;#034;one-pixel-resolution&amp;#034; by using Mathematica&amp;#039;s powerful image analysis abilities. This is a work in progress and I would be delighted to get some comments/suggestions from the Community. In this project, I had a lot of help from [url=http://community.wolfram.com/web/bschelter/home]Bjoern Schelter[/url], who has recently joined this Community. If you have the components and use the programs below, you should have a &amp;#034;working&amp;#034; one-pixel thermal camera after 30 minutes or so of DIY. Here&amp;#039;s a sneak peek of what we want to get out (this one is a &amp;#034;selfie&amp;#034;):&#xD;
&#xD;
[img=width: 426px; height: 270px;]/c/portal/getImageAttachment?filename=asdwefasdcsdvvafe2345QT.PNG&amp;amp;userId=11733[/img]&#xD;
&#xD;
I use the following components:[list=1]&#xD;
[*]Arduino Uno R3&#xD;
[*][url=http://www.amazon.co.uk/XINTE-MLX90614ESF-DCI-non-contact-Infrared-Temperature/dp/B00IMU0LXG/ref=sr_1_2?ie=UTF8&amp;amp;qid=1399153918&amp;amp;sr=8-2&amp;amp;keywords=Melexis]MELEXIS / MLX90614ESF-DCI / DS Digital non-contact Infrared Temperature Sensor[/url]  (~ £35 and more or less the same in USD)&#xD;
[*][url=http://www.amazon.co.uk/MG995-Servo-Sensor-Mount-Black/dp/B00EZIYCUW/ref=sr_1_3?ie=UTF8&amp;amp;qid=1399153991&amp;amp;sr=8-3&amp;amp;keywords=Pan+tilt]MG995 Servo Sensor Mount Kit 2 DOF Pan and Tilt Black[/url] (~ £24, similar in USD)&#xD;
[*]Two 4.7 kOhm resistors.&#xD;
[*]One 0.1 uF capacitor.&#xD;
[*]One small breadboard.&#xD;
[*]5V power source.&#xD;
[*]Wires. &#xD;
[/list]The idea is illustrated in this [url=http://www.youtube.com/watch?v=rcTKVOzxCmw]Youtube video[/url]. To my best knowledge, the original idea comes from a [url=http://www.theimagingsource.com/en_US/blog/posts/20090622/]project of Steffen Strobel in the German science competition &amp;#034;Jugend Forscht&amp;#034;[/url]. The main idea is to mount a non-contact temperature sensor on a pan and tilt mechanism (i.e. two servos) on a tripod. An Arduino microcontroller is then used to communicate via the serial port with Mathematica, which is used to control the servos and triggers the measurements. After the data acquisition Mathematica cleans the data and produces some thermal images (see below).&#xD;
&#xD;
We use the following wiring diagram to connect the servos and the temperature sensor to the Arduino.&#xD;
[center][img=width: 300px; height: 203px;]/c/portal/getImageAttachment?filename=ThermoCam.jpg&amp;amp;userId=48754[/img][/center]&#xD;
The resistors are 4.7kOhm and the capacitor is 0.1uF. The sensor part is taken from the [url=http://bildr.org/2011/02/mlx90614-arduino/]bildr.blog[/url], which also shows how to make Arduino talk to the sensor. The Melexis sensor that we chose has a temperature resolution of 0.02 degrees Celsius and a rather narrow field of view, which is important for our application. &#xD;
&#xD;
For the servo part, we use the standard servo.h library; an example of its application can be found [url=http://arduino.cc/en/Tutorial/sweep]here[/url].&#xD;
&#xD;
Here is a photo of the sensor/head of the device.&#xD;
[center][img=width: 320px; height: 240px;]/c/portal/getImageAttachment?filename=photo.JPG&amp;amp;userId=48754[/img][/center]&#xD;
The entire device looks like this.&#xD;
[center][img=width: 240px; height: 320px;]/c/portal/getImageAttachment?filename=7033photo4.JPG&amp;amp;userId=48754[/img][/center]&#xD;
The idea is to use Mathematica to send instructions to the servos and to initiate the measurements. To interface Mathematica with Arduino we use the [url=http://library.wolfram.com/infocenter/Demos/5726/]SerialIO package[/url]. I found [url=http://williamjturkel.net/2011/12/25/connecting-arduino-to-mathematica-on-mac-os-x-with-serialio/]this website by William Turkel[/url] very useful to make SerialIO work on my Mac; following the steps and adapting some directories makes the package work without any problems.&#xD;
&#xD;
At that point, we have everything in place, and only need to put the bits together. We first need to upload this piece of code (also attached at the bottom) to the Arduino.&#xD;
[code]#include &amp;lt;i2cmaster.h&amp;gt;&#xD;
#include &amp;lt;Servo.h&amp;gt; &#xD;
&#xD;
//Servo setup&#xD;
int servoPin1 = 9;&#xD;
int servoPin2 = 10; &#xD;
Servo servo1;  &#xD;
Servo servo2;&#xD;
int angle1 = 40;   // servo start positions in degrees &#xD;
int angle2 = 50;&#xD;
&#xD;
&#xD;
//Melexis setup&#xD;
int sensor = 0;&#xD;
int inByte = 0;&#xD;
&#xD;
&#xD;
void setup()&#xD;
{&#xD;
	Serial.begin(9600);&#xD;
	&#xD;
       // attach pan-tilt servos&#xD;
       servo1.attach(servoPin1);&#xD;
       servo2.attach(servoPin2); &#xD;
&#xD;
&#xD;
       servo1.write(angle1);&#xD;
       servo2.write(angle2);&#xD;
&#xD;
&#xD;
	//Initialise the i2c bus&#xD;
	i2c_init(); &#xD;
	PORTC = (1 &amp;lt;&amp;lt; PORTC4) | (1 &amp;lt;&amp;lt; PORTC5);//enable pullups&#xD;
       establishContact();&#xD;
}&#xD;
&#xD;
&#xD;
void loop()&#xD;
{&#xD;
 if (Serial.available() &amp;gt; 0) &#xD;
  {&#xD;
   inByte = Serial.read();&#xD;
   &#xD;
   int dev = 0x5A&amp;lt;&amp;lt;1;&#xD;
   int data_low = 0;&#xD;
   int data_high = 0;&#xD;
   int pec = 0;&#xD;
&#xD;
&#xD;
   i2c_start_wait(dev+I2C_WRITE);&#xD;
   i2c_write(0x07);&#xD;
&#xD;
&#xD;
   // read&#xD;
   i2c_rep_start(dev+I2C_READ);&#xD;
   data_low = i2c_readAck(); //Read 1 byte and then send ack&#xD;
   data_high = i2c_readAck(); //Read 1 byte and then send ack&#xD;
   pec = i2c_readNak();&#xD;
   i2c_stop();&#xD;
&#xD;
&#xD;
   //This converts high and low bytes together and processes temperature, MSB is a error bit and is ignored for temps&#xD;
   double tempFactor = 0.02; // 0.02 degrees per LSB (measurement resolution of the MLX90614)&#xD;
   double tempData = 0x0000; // zero out the data&#xD;
   int frac; // data past the decimal point&#xD;
&#xD;
&#xD;
 // Serial.print(tempData);&#xD;
 // Serial.write(inByte);&#xD;
   // This masks off the error bit of the high byte, then moves it left 8 bits and adds the low byte.&#xD;
   tempData = (double)(((data_high &amp;amp; 0x007F) &amp;lt;&amp;lt; 8) + data_low);&#xD;
   tempData = (tempData * tempFactor)-0.01;&#xD;
&#xD;
&#xD;
   //inByte = (float)(((data_high &amp;amp; 0x007F) &amp;lt;&amp;lt; 8) + data_low);&#xD;
&#xD;
&#xD;
  float celsius = tempData - 273.15;&#xD;
   sensor=(int)(celsius*100);&#xD;
   //float fahrenheit = (celsius*1.8) + 32;&#xD;
&#xD;
  Serial.print(sensor);&#xD;
  &#xD;
   &#xD;
   // horizontal &amp;#034;H&amp;#034;-&amp;gt; 72; reverse &amp;#034;R&amp;#034;-&amp;gt; 82; vertical &amp;#034;V&amp;#034;-&amp;gt; 86; end &amp;#034;E&amp;#034;-&amp;gt; 69&#xD;
   &#xD;
  if(inByte==72)&#xD;
  {&#xD;
   angle1=angle1+1;&#xD;
   servo1.write(angle1);&#xD;
  }&#xD;
   if(inByte==82)&#xD;
  {&#xD;
   angle1=40;&#xD;
   servo1.write(angle1);&#xD;
  }&#xD;
  if(inByte==86)&#xD;
  {&#xD;
   angle2=angle2+1;&#xD;
   servo2.write(angle2);&#xD;
  }&#xD;
    if(inByte==69)&#xD;
  {&#xD;
   angle1 = 40;   // servo back to start&#xD;
   angle2 = 50;&#xD;
   servo1.write(angle1);&#xD;
   servo2.write(angle2);&#xD;
  }&#xD;
  &#xD;
   delay(15); // 15 works; wait 15 milliseconds before printing again&#xD;
 }&#xD;
&#xD;
&#xD;
}&#xD;
&#xD;
&#xD;
&#xD;
void establishContact() &#xD;
{&#xD;
 while (Serial.available() &amp;lt;= 0) &#xD;
 {&#xD;
   Serial.print(&amp;#039;A&amp;#039;);&#xD;
   delay(100);&#xD;
 }&#xD;
}&#xD;
[/code]&#xD;
The idea is to make Mathematica communicate with the Arduino via the serial connection. The Arduino sketch shows that the Arduino is waiting for instructions, e.g. &amp;#034;H&amp;#034; to move horizontally, &amp;#034;V&amp;#034; to move vertically and &amp;#034;E&amp;#034; to go to the end position.&#xD;
[mcode](*First we load the SerialIO package. See instructions above.*)&#xD;
&#xD;
&amp;lt;&amp;lt; SerialIO`&#xD;
&#xD;
(*We test whether Mathematica&amp;#039;s applications folder is in the Path. On some Macs Mathematica will be in the /Library directory - used in this example- and in others in the /Users/username/Library directory, where &amp;#034;username&amp;#034; needs to be replaced by the correct user name.*)&#xD;
&#xD;
MemberQ[$Path, &amp;#034;/Library/Mathematica/Applications&amp;#034;]&#xD;
&#xD;
(*If this gives True all is fine. If it evaluates to False execute&#xD;
AppendTo[$Path, &amp;#034;/Library/Mathematica/Applications&amp;#034;]&#xD;
*)&#xD;
&#xD;
(*Connect to the Arduino*)&#xD;
&#xD;
myArduino = &#xD;
  SerialOpen[Quiet[FileNames[&amp;#034;tty.usb*&amp;#034;, {&amp;#034;/dev&amp;#034;}, Infinity]][[1]]];&#xD;
SerialSetOptions[myArduino, &amp;#034;BaudRate&amp;#034; -&amp;gt; 9600];&#xD;
While[SerialReadyQ[myArduino] == False, Pause[0.1]];&#xD;
&#xD;
(*Data collection, in this case 40 vertical and 70 horizontal pixels; runtime 2-3 minutes; pauses cannot be reduced much further.*)&#xD;
&#xD;
pixels = {}; SerialRead[myArduino]; For[j = 1, j &amp;lt; 41, j++, &#xD;
 For[i = 1, i &amp;lt; 71, i++, SerialWrite[myArduino, &amp;#034;H&amp;#034;]; &#xD;
  AppendTo[pixels, (SerialRead[myArduino] // ToExpression)/100.]; &#xD;
  Pause[0.1]]; SerialWrite[myArduino, &amp;#034;R&amp;#034;]; &#xD;
 SerialWrite[myArduino, &amp;#034;V&amp;#034;]; SerialRead[myArduino]; &#xD;
 Pause[0.1];]; SerialWrite[myArduino, &amp;#034;E&amp;#034;];&#xD;
&#xD;
(*After the data acquisition close the connection to the Arduino*)&#xD;
SerialClose[myArduino]&#xD;
&#xD;
(*Now we can use several different ways to represent the data; note that some points at the beginning/end of the scanned lines are removed: there were too many measurement errors just after the &amp;#034;carriage return&amp;#034;*)&#xD;
&#xD;
ArrayPlot[Partition[Reverse[pixels], 70][[All, 2 ;; -10]], &#xD;
 ColorFunction -&amp;gt; &amp;#034;Rainbow&amp;#034;]&#xD;
&#xD;
(*here&amp;#039;s another colour scheme.*)&#xD;
ArrayPlot[Partition[Reverse[pixels], 70][[All, 2 ;; -10]], &#xD;
 ColorFunction -&amp;gt; &amp;#034;Temperature&amp;#034;]&#xD;
&#xD;
(*Occasionally there are some outliers in the measurements; here we clean them out.*)&#xD;
ArrayPlot[&#xD;
 Partition[Reverse[pixels /. x_ /; x &amp;gt; 35. -&amp;gt; 35.], 70][[All, &#xD;
   2 ;; -10]], ColorFunction -&amp;gt; &amp;#034;Temperature&amp;#034;]&#xD;
&#xD;
(*This last one uses interpolation to make the image smoother.*)&#xD;
&#xD;
ListContourPlot[&#xD;
 Partition[Reverse[Log /@ pixels /. x_ /; x &amp;gt; 35. -&amp;gt; 35.], &#xD;
   70][[-1 ;; 1 ;; -1, 1 ;; -10]], AspectRatio -&amp;gt; 0.9, &#xD;
 ColorFunction -&amp;gt; &amp;#034;Rainbow&amp;#034;, PlotRange -&amp;gt; All, &#xD;
 InterpolationOrder -&amp;gt; 2, Contours -&amp;gt; 60, ContourStyle -&amp;gt; None][/mcode][center][/center]So here&amp;#039;s a photo of my broken boiler and its scan:&#xD;
[center][img=width: 518px; height: 257px;]/c/portal/getImageAttachment?filename=BoilerScan.jpg&amp;amp;userId=48754[/img][/center]&#xD;
Because of the scanning procedure (which just looks at the angle and does not use any projection), the scan is slightly distorted, but it is possible to recognize the main features and even the sticker on the front!&#xD;
&#xD;
It appears that this rather primitive device can also be used to analyse electrical components. Here is an image of my MacBook Pro. [center][img=width: 450px; height: 306px;]/c/portal/getImageAttachment?filename=Laptop1.jpg&amp;amp;userId=48754[/img][/center]&#xD;
 The position of the CPU becomes quite obvious.&#xD;
&#xD;
There are many things that need to be improved: &#xD;
&#xD;
(i) First, there is the projection issue: the scanner scans angles, which need to be projected onto a 2D plane. One might use an ultrasonic distance sensor to get better results.&#xD;
(ii) The device needs to be calibrated.&#xD;
(iii) A user interface is needed. It would be useful to click on the image and get the temperature reading.&#xD;
(iv) The communication between Mathematica and the Arduino needs to be improved. The starting position of 40/50 degrees is hard-coded into the Arduino sketch; it should be set from the Mathematica code.&#xD;
(v) We have not even started to use Mathematica&amp;#039;s features on this. Much image processing could be done. The image could be overlaid on a normal photo of the scanned object. Manipulate could be used to change thresholds, i.e. the threshold to cut off outliers, which is currently set to 35 degrees.&#xD;
(vi) The speed might be improved. I suppose that the scanning time of 3 minutes or so is typical for these devices, but one might improve that a bit. Also, Mathematica could use edge-detection to determine regions where a higher scan density would be helpful to get a better resolution. This only makes sense if the servos could be directed to a certain position much more precisely; alternatively, we could use random positions, which then are precisely determined using an accelerometer or so.&#xD;
&#xD;
There is of course much more to do. In spite of this being work in progress, I wanted to share this project, and hope for helpful comments.&#xD;
&#xD;
I attach the Mathematica notebook. I have a movie of the scanning process and the actual arduino sketch which I cannot upload directly. Here are links to the [url=https://www.dropbox.com/s/sm2prb5w0wrqexp/ThermoCamera_Forum.zip]arduino sketch[/url] and the [url=https://www.dropbox.com/s/wpgpww8a7jhjfn0/Scan.MOV]scanning movie[/url].&#xD;
&#xD;
M.</description>
    <dc:creator>Marco Thiel</dc:creator>
    <dc:date>2014-05-04T00:48:33Z</dc:date>
  </item>
  <item rdf:about="https://community.wolfram.com/groups/-/m/t/315748">
    <title>Programming the world with Arduino and Wolfram Language</title>
    <link>https://community.wolfram.com/groups/-/m/t/315748</link>
    <description>&amp;amp;[Wolfram Notebook][1]&#xD;
&#xD;
&#xD;
  [1]: https://www.wolframcloud.com/obj/4d655a10-832e-4ace-b937-6d5a702289a3</description>
    <dc:creator>Ian Johnson</dc:creator>
    <dc:date>2014-08-10T17:56:06Z</dc:date>
  </item>
  <item rdf:about="https://community.wolfram.com/groups/-/m/t/344278">
    <title>Using your smart phone as the ultimate sensor array for Mathematica</title>
    <link>https://community.wolfram.com/groups/-/m/t/344278</link>
    <description>Many fantastic posts in this community describe how to connect external devices to Mathematica and how to read the data. Connecting Mathematica to an Arduino for example allows you to read and then work with data from all kinds of sensors. In most of the cases, when we speak about connected devices, additional hardware is necessary. Smart phones, on the other hand, are our permanent companions and they host a wide array of sensors that we can tap into with Mathematica. For this post, I will be using an iPhone 5 - but a similar approach can be taken with many other smart phones. [Björn Schelter][1] and myself have worked on this together.&#xD;
&#xD;
The first thing we need in order to be able to read the iPhone is a little App which can be purchased on the iTunes App store: it is called [Sensor Data][2]. When you open the app you see a screen like this one. &#xD;
&#xD;
![enter image description here][3]&#xD;
&#xD;
At the top of the screen you see an IP address and a port number (after the colon!). These numbers will be important to connect to the phone and either download data or stream sensor data directly. If you click on the &amp;#034;start capture&amp;#034; the iPhone&amp;#039;s data will be stored on the phone and can be downloaded into Mathematica. In this post we are rather interested in the &amp;#034;Streaming&amp;#034; function. If you click on the respective button on the bottom you get to a screen like this:&#xD;
&#xD;
![enter image description here][4]&#xD;
&#xD;
There you can choose a frequency for the measurements and start the streaming. In fact we also can choose which sensors we want to use with the Config button. &#xD;
&#xD;
![enter image description here][5]&#xD;
&#xD;
The following Mathematica code will work when all (!) sensors are switched on. Now we are ready to connect to the iPhone. Switch the streaming on and execute the following commands:&#xD;
&#xD;
    ClearAll[&amp;#034;Global`*&amp;#034;];&#xD;
    For[i = 1, i &amp;lt; 3, i++, Quiet[InstallJava[]]];&#xD;
    Needs[&amp;#034;JLink`&amp;#034;]&#xD;
&#xD;
and then &#xD;
&#xD;
    LoadJavaClass[&amp;#034;java.util.Arrays&amp;#034;];&#xD;
    packet = JavaNew[&amp;#034;java.net.DatagramPacket&amp;#034;, JavaNew[&amp;#034;[B&amp;#034;, 1024], 1024];&#xD;
    socket = JavaNew[&amp;#034;java.net.DatagramSocket&amp;#034;, 10552];&#xD;
    socket@setSoTimeout[10];&#xD;
    listen[] := If[$Failed =!= Quiet[socket@receive[packet], Java::excptn], &#xD;
    record =JavaNew[&amp;#034;java.lang.String&amp;#034;, java`util`Arrays`copyOfRange @@ &#xD;
    packet /@ {getData[], getOffset[], getLength[]}]@toString[] //&#xD;
    Sow];&#xD;
&#xD;
Next we have to define a ScheduledTask to read the sensors:&#xD;
&#xD;
    RemoveScheduledTask[ScheduledTasks[]];&#xD;
    results = {}; &#xD;
    RunScheduledTask[AppendTo[results, Quiet[Reap[listen[]][[2, 1]]]]; If[Length[results] &amp;gt; 1200, Drop[results, 150]], 0.01];&#xD;
&#xD;
We also need to define a streaming function:&#xD;
&#xD;
    stream := Refresh[ToExpression[StringSplit[#[[1]], &amp;#034;,&amp;#034;]] &amp;amp; /@ Select[results[[-1000 ;;]], Head[#] == List &amp;amp;], UpdateInterval -&amp;gt; 0.01]&#xD;
&#xD;
Alright. Now comes the interesting part. Using &#xD;
&#xD;
    (*Compass*)&#xD;
    While[Length[results] &amp;lt; 1000, Pause[2]]; Dynamic[AngularGauge[Refresh[stream[[-1, 30]], UpdateInterval -&amp;gt; 0.01], {360, 0}, &#xD;
    ScaleDivisions -&amp;gt; None, GaugeLabels -&amp;gt; {Placed[&amp;#034;N&amp;#034;, Top], Placed[&amp;#034;S&amp;#034;, Bottom], Placed[&amp;#034;E&amp;#034;, Right], Placed[&amp;#034;W&amp;#034;, Left]}, ScaleOrigin -&amp;gt; {{5 Pi/2, Pi/2}, 1}, ScalePadding -&amp;gt; All, ImageSize -&amp;gt; Medium], SynchronousUpdating -&amp;gt; False]&#xD;
&#xD;
we can measure the bearing of our iPhone. The resulting compass moves as we move the iPhone:&#xD;
&#xD;
![enter image description here][6]&#xD;
&#xD;
We can also read the (x-,y-,z-) accelerometers&#xD;
&#xD;
    (*Plot accelerometers*)&#xD;
    While[Length[results] &amp;lt; 1000, Pause[2]]; Dynamic[Refresh[ListLinePlot[{stream[[All, 2]], stream[[All, 3]], stream[[All, 4]]}, PlotRange -&amp;gt; All], UpdateInterval -&amp;gt; 0.1]]&#xD;
&#xD;
which gives plots like this one:&#xD;
&#xD;
![enter image description here][7]&#xD;
&#xD;
The update is a bit bumpy, because the data is only sent from the iPhone every second or so; the measurements, however, are taken at a frequency of up to 100 Hz. We can also display the FFT of the streamed data like so:&#xD;
&#xD;
    (*Plot FFT of accelerometers*)&#xD;
    While[Length[results] &amp;lt; 1000, &#xD;
     Pause[2]]; Dynamic[&#xD;
     Refresh[ListLinePlot[&#xD;
       Log /@ {Abs[Fourier[Standardize[stream[[All, 2]]]]], &#xD;
         Abs[Fourier[Standardize[stream[[All, 3]]]]], &#xD;
         Abs[Fourier[Standardize[stream[[All, 4]]]]]}, &#xD;
       PlotRange -&amp;gt; {{0, 200}, {-5, 2.5}}, ImageSize -&amp;gt; Large], &#xD;
      UpdateInterval -&amp;gt; 0.1]]&#xD;
&#xD;
Adding a &amp;#034;real time&amp;#034; scale is also quite straightforward:&#xD;
&#xD;
    (*Measurements with time scale*)&#xD;
&#xD;
    While[Length[results] &amp;lt; 1000, Pause[2]];&#xD;
    starttime = IntegerPart[stream[[2, 1]]];&#xD;
    Dynamic[Refresh[&#xD;
      ListLinePlot[&#xD;
       Transpose[{(stream[[Max[-300, -Length[stream]] ;;, 1]] - &#xD;
           starttime), stream[[Max[-300, -Length[stream]] ;;, 2]]}], &#xD;
       PlotRange -&amp;gt; All, ImageSize -&amp;gt; Large], UpdateInterval -&amp;gt; 0.01]]&#xD;
&#xD;
Well, then. We can also plot our iPhone&amp;#039;s orientation in space&#xD;
&#xD;
    (*3d Motion*)&#xD;
    &#xD;
    While[Length[results] &amp;lt; 1000, Pause[2]]; Dynamic[&#xD;
     Refresh[ListLinePlot[{stream[[All, 5]], stream[[All, 6]], &#xD;
        stream[[All, 7]]}, PlotRange -&amp;gt; All], UpdateInterval -&amp;gt; 0.1]]&#xD;
    &#xD;
    While[Length[results] &amp;lt; 1000, Pause[2]]; Dynamic[&#xD;
     Graphics3D[{Black, &#xD;
       Rotate[Rotate[&#xD;
         Rotate[Cuboid[{-2, -1, -0.2}, {2, 1, 0.2}], &#xD;
          stream[[-1, 7]], {0, 0, 1}], -1*stream[[-1, 6]], {0, 1, 0}], &#xD;
        stream[[-1, 5]], {1, 0, 0}]}, &#xD;
      PlotRange -&amp;gt; {{-3, 3}, {-3, 3}, {-3, 3}}, Boxed -&amp;gt; True], &#xD;
     UpdateInterval -&amp;gt; 0.1, SynchronousUpdating -&amp;gt; False]&#xD;
&#xD;
This looks like so:&#xD;
&#xD;
![enter image description here][8]&#xD;
&#xD;
Last but not least, we can write a little GUI to access all the different sensors. (This does run a bit slowly, though!)&#xD;
&#xD;
    (*GUI all sensors*)&#xD;
&#xD;
    sensororder = {&amp;#034;Timestamp&amp;#034;, &amp;#034;Accel_X&amp;#034;, &amp;#034;Accel_Y&amp;#034;, &amp;#034;Accel_Z&amp;#034;, &amp;#034;Roll&amp;#034;, &#xD;
       &amp;#034;Pitch&amp;#034;, &amp;#034;Yaw&amp;#034;, &amp;#034;Quat.X&amp;#034;, &amp;#034;Quat.Y&amp;#034;, &amp;#034;Quat.Z&amp;#034;, &amp;#034;Quat.W&amp;#034;, &amp;#034;RM11&amp;#034;, &#xD;
       &amp;#034;RM12&amp;#034;, &amp;#034;RM13&amp;#034;, &amp;#034;RM21&amp;#034;, &amp;#034;RM22&amp;#034;, &amp;#034;RM23&amp;#034;, &amp;#034;RM31&amp;#034;, &amp;#034;RM32&amp;#034;, &amp;#034;RM33&amp;#034;, &#xD;
       &amp;#034;GravAcc_X&amp;#034;, &amp;#034;GravAcc_Y&amp;#034;, &amp;#034;GravAcc_Z&amp;#034;, &amp;#034;UserAcc_X&amp;#034;, &amp;#034;UserAcc_Y&amp;#034;, &#xD;
       &amp;#034;UserAcc_Z&amp;#034;, &amp;#034;RotRate_X&amp;#034;, &amp;#034;RotRate_Y&amp;#034;, &amp;#034;RotRate_Z&amp;#034;, &amp;#034;MagHeading&amp;#034;, &#xD;
       &amp;#034;TrueHeading&amp;#034;, &amp;#034;HeadingAccuracy&amp;#034;, &amp;#034;MagX&amp;#034;, &amp;#034;MagY&amp;#034;, &amp;#034;MagZ&amp;#034;, &amp;#034;Lat&amp;#034;, &#xD;
       &amp;#034;Long&amp;#034;, &amp;#034;LocAccuracy&amp;#034;, &amp;#034;Course&amp;#034;, &amp;#034;Speed&amp;#034;, &amp;#034;Altitude&amp;#034;, &#xD;
       &amp;#034;Proximity&amp;#034;};&#xD;
    While[Length[results] &amp;lt; 1000, Pause[2]]; Manipulate[&#xD;
     Dynamic[Refresh[&#xD;
       ListLinePlot[{stream[[All, Position[sensororder, a][[1, 1]]]], &#xD;
         stream[[All, Position[sensororder, b][[1, 1]]]], &#xD;
         stream[[All, Position[sensororder, c][[1, 1]]]]}, &#xD;
        PlotRange -&amp;gt; All, ImageSize -&amp;gt; Full], &#xD;
       UpdateInterval -&amp;gt; 0.01]], {{a, &amp;#034;Accel_X&amp;#034;}, &#xD;
      sensororder}, {{b, &amp;#034;Accel_Y&amp;#034;}, sensororder}, {{c, &amp;#034;Accel_Z&amp;#034;}, &#xD;
      sensororder}, ControlPlacement -&amp;gt; Left, &#xD;
     SynchronousUpdating -&amp;gt; False]&#xD;
&#xD;
This gives a user interface which looks like this:&#xD;
&#xD;
![enter image description here][9]&#xD;
&#xD;
In the drop-down menus we can choose three out of all the sensors. These are all available sensors:&#xD;
&#xD;
&amp;gt; &amp;#034;Timestamp&amp;#034;, &amp;#034;Accel_X&amp;#034;, &amp;#034;Accel_Y&amp;#034;, &amp;#034;Accel_Z&amp;#034;, &amp;#034;Roll&amp;#034;, &amp;#034;Pitch&amp;#034;, &amp;#034;Yaw&amp;#034;,&#xD;
&amp;gt; &amp;#034;Quat.X&amp;#034;, &amp;#034;Quat.Y&amp;#034;, &amp;#034;Quat.Z&amp;#034;, &amp;#034;Quat.W&amp;#034;, &amp;#034;RM11&amp;#034;,  &amp;#034;RM12&amp;#034;, &amp;#034;RM13&amp;#034;,&#xD;
&amp;gt; &amp;#034;RM21&amp;#034;, &amp;#034;RM22&amp;#034;, &amp;#034;RM23&amp;#034;, &amp;#034;RM31&amp;#034;, &amp;#034;RM32&amp;#034;, &amp;#034;RM33&amp;#034;, &amp;#034;GravAcc_X&amp;#034;,&#xD;
&amp;gt; &amp;#034;GravAcc_Y&amp;#034;, &amp;#034;GravAcc_Z&amp;#034;, &amp;#034;UserAcc_X&amp;#034;, &amp;#034;UserAcc_Y&amp;#034;,   &amp;#034;UserAcc_Z&amp;#034;,&#xD;
&amp;gt; &amp;#034;RotRate_X&amp;#034;, &amp;#034;RotRate_Y&amp;#034;, &amp;#034;RotRate_Z&amp;#034;, &amp;#034;MagHeading&amp;#034;, &amp;#034;TrueHeading&amp;#034;,&#xD;
&amp;gt; &amp;#034;HeadingAccuracy&amp;#034;, &amp;#034;MagX&amp;#034;, &amp;#034;MagY&amp;#034;, &amp;#034;MagZ&amp;#034;, &amp;#034;Lat&amp;#034;,   &amp;#034;Long&amp;#034;,&#xD;
&amp;gt; &amp;#034;LocAccuracy&amp;#034;, &amp;#034;Course&amp;#034;, &amp;#034;Speed&amp;#034;, &amp;#034;Altitude&amp;#034;, &amp;#034;Proximity&amp;#034;&#xD;
&#xD;
There are certainly many things that can and should be improved. The main problem seems to be that the data, even if sampled at 100 Hz, is sent from the iPhone only every second or so, so it is not really real time. I hope that someone who is better at iPhone programming than I am - I am really rubbish at it - could help and write an iPhone program that streams the data in a more convenient way: one measurement at a time rather than in packets. &#xD;
&#xD;
There are many potential applications for this. Here are some I could come up with:&#xD;
&#xD;
 1. You can carry the iPhone around and measure your movements (acceleration). Attached to your hand, it can measure your tremor. &#xD;
 2. The magnetometer is really cool. You can use it to find metal bars and also electric cables in the walls. &#xD;
 3. You can collect GPS data for all sorts of applications; there are ideas to use this for the detection of certain diseases. For example, if it takes you longer than usual to find your car after shopping, that might hint at early stages of dementia --- or sleep deprivation.&#xD;
 4. When you put the phone on a machine, such as a running motor, you can measure its vibrations. Performing a frequency analysis then lets you check whether the motor is running properly.&#xD;
 5. Using the accelerometers, I was able to measure my breathing (by putting the phone on my chest).&#xD;
 &#xD;
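&#xD;
Building on point 4, here is a minimal sketch (an illustration only; it assumes the streaming setup above is running and that the data really is sampled at 100 Hz) of how the dominant vibration frequency could be extracted from the accelerometer stream:&#xD;
&#xD;
    (*Sketch: dominant vibration frequency; assumes stream from above and a 100 Hz sampling rate*)&#xD;
    data = Standardize[stream[[All, 2]]];&#xD;
    spec = Abs[Fourier[data]];&#xD;
    n = Length[spec];&#xD;
    peakPos = First[Ordering[spec[[2 ;; Floor[n/2]]], -1]] + 1;&#xD;
    dominantFrequency = N[(peakPos - 1)*100/n]&#xD;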
I think that there might also be quite some potential for using the Wolfram Cloud here. Deploying a program in the cloud and reading from your phone is certainly quite interesting. The problem is that this particular app only works via WiFi; it would be nice to have one that works via 3G. &#xD;
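&#xD;
As a rough sketch of the cloud idea (the cloud object name is an assumption, and it presumes the streaming setup above is running on the local kernel), one could periodically push the latest readings to a cloud object:&#xD;
&#xD;
    (*Sketch: push the latest accelerometer readings to a cloud object every 30 seconds*)&#xD;
    RunScheduledTask[CloudPut[stream[[All, 2]], &amp;#034;latestAccelerometer&amp;#034;], 30];&#xD;
&#xD;
Anyone with access could then retrieve the readings with CloudGet.&#xD;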
&#xD;
So, in summary, it might be quite useful to make use of the iPhone&amp;#039;s sensors. The advantage is that nearly everyone carries a smartphone with them all the time. Making more of your smartphone&amp;#039;s sensors with Mathematica seems to be a nice playground for applications. I&amp;#039;d love to hear about your ideas...&#xD;
&#xD;
Cheers,&#xD;
&#xD;
Marco&#xD;
&#xD;
PS: When you are done with the streaming you should execute these commands:&#xD;
&#xD;
    (*Remove Scheduled Tasks and close link*)&#xD;
    RemoveScheduledTask[ScheduledTasks[]]; socket@close[];&#xD;
&#xD;
&#xD;
  [1]: http://community.wolfram.com/web/bschelter&#xD;
  [2]: https://itunes.apple.com/gb/app/sensor-data/id397619802?mt=8&#xD;
  [3]: /c/portal/getImageAttachment?filename=sensorwelcome.PNG&amp;amp;userId=48754&#xD;
  [4]: /c/portal/getImageAttachment?filename=sensorstreaming.PNG&amp;amp;userId=48754&#xD;
  [5]: /c/portal/getImageAttachment?filename=Allsensors.PNG&amp;amp;userId=48754&#xD;
  [6]: /c/portal/getImageAttachment?filename=Compass.gif&amp;amp;userId=48754&#xD;
  [7]: /c/portal/getImageAttachment?filename=Accelerometer.gif&amp;amp;userId=48754&#xD;
  [8]: /c/portal/getImageAttachment?filename=Iphonemovement.gif&amp;amp;userId=48754&#xD;
  [9]: /c/portal/getImageAttachment?filename=ScreenShot2014-09-15at23.52.46.png&amp;amp;userId=48754</description>
    <dc:creator>Marco Thiel</dc:creator>
    <dc:date>2014-09-15T23:53:14Z</dc:date>
  </item>
  <item rdf:about="https://community.wolfram.com/groups/-/m/t/1028536">
    <title>Mathematica 11.0.1 now available for the Raspberry Pi</title>
    <link>https://community.wolfram.com/groups/-/m/t/1028536</link>
    <description>Hi all,&#xD;
&#xD;
Mathematica 11.0.1 is now available for the Raspberry Pi on Raspbian. If you already have Mathematica installed on your Raspberry Pi, you can update with the following:&#xD;
&#xD;
    sudo apt-get update &amp;amp;&amp;amp; sudo apt-get upgrade wolfram-engine&#xD;
&#xD;
If you don&amp;#039;t already have Mathematica installed you can run the following commands to install it:&#xD;
&#xD;
    sudo apt-get update &amp;amp;&amp;amp; sudo apt-get install wolfram-engine&#xD;
&#xD;
New features for the Raspberry Pi include:&#xD;
&#xD;
 - Neural Network features including constructing custom nets : http://reference.wolfram.com/language/guide/NeuralNetworks.html&#xD;
 - Audio processing features including out of core streaming of large sounds as well as advanced audio processing : http://reference.wolfram.com/language/guide/AudioProcessing.html&#xD;
 - Travel based path plan functions including path finding from one city to another : http://reference.wolfram.com/language/guide/LocationsPathsAndRouting.html&#xD;
 - Channel based communication for sending and receiving messages : http://reference.wolfram.com/language/guide/Channel-BasedCommunication.html&#xD;
 - Powerful and easy scripting through WolframScript : http://reference.wolfram.com/language/ref/program/wolframscript.html&#xD;
 - And many more : http://reference.wolfram.com/language/guide/SummaryOfNewFeaturesIn11.html&#xD;
&#xD;
Additionally, with the new release of WolframScript on the Raspberry Pi, you can install WolframScript standalone and, using the `-cloud` option, run it against the cloud without a local kernel. This means you can use the Wolfram Language through WolframScript on the Raspberry Pi without having wolfram-engine installed. See the documentation page for WolframScript for more details.</description>
    <dc:creator>Ian Johnson</dc:creator>
    <dc:date>2017-03-09T21:02:49Z</dc:date>
  </item>
  <item rdf:about="https://community.wolfram.com/groups/-/m/t/1057588">
    <title>Parallel Mathematica Environment on the RaspberryPi using OOP</title>
    <link>https://community.wolfram.com/groups/-/m/t/1057588</link>
    <description>My project, Parallel Mathematica Environment on the RaspberryPi using OOP, is a sample application of **Object Oriented Programming for the Mathematica** cluster computing, implemented with a Mac and three RaspberryPi Zero connected with a USB hub and three USB cables.&#xD;
&#xD;
The basic idea is to deploy a constructed instance image to the calculating servers (the RaspberryPis) and send messages to the instances. [OOP for Mathematica has already been developed and shown][1] in this community, and further details are given on [SlideShare][2] under the title &amp;#034;OOP for Mathematica.&amp;#034;&#xD;
![enter image description here][3]&#xD;
&#xD;
&#xD;
----------&#xD;
&#xD;
&#xD;
&#xD;
The RaspberryPi Zeros are prepared as follows, using an SSH connection from a Mac: &#xD;
&#xD;
 - naming each Zero raspberrypi, raspberrypi1, raspberrypi2, ...&#xD;
 - setting up the server program &amp;#034;init&amp;#034; on each RaspberryPi, where init is&#xD;
&#xD;
        $ cat init&#xD;
        While[True,&#xD;
        Run[nc -l 8000&amp;gt;input];&#xD;
        temp=ReleaseHold[&amp;lt;&amp;lt;input];&#xD;
        temp &amp;gt;&amp;gt;output;&#xD;
        Run[nc your-mac-hostname.local 8002&amp;lt;output]&#xD;
        ]&#xD;
        &#xD;
&#xD;
where the socket numbers must match.&#xD;
&#xD;
 - Run Mathematica manually, and wait for it to boot up.&#xD;
&#xD;
        $ wolfram &amp;lt;init&amp;amp;&#xD;
&#xD;
Checking each RaspberryPi can be done as follows:&#xD;
&#xD;
    $ nc -l 8002 &amp;gt;output|nc raspberrypi.local 8000 &amp;lt;&amp;lt;EOF&#xD;
    &amp;gt; 10!&#xD;
    &amp;gt; EOF&#xD;
    $ cat output&#xD;
    3628800&#xD;
&#xD;
&#xD;
----------&#xD;
&#xD;
Cluster controller program on a Mac is,&#xD;
&#xD;
 - set directory&#xD;
&#xD;
        SetDirectory[NotebookDirectory[]];&#xD;
&#xD;
 - setup socket communication process&#xD;
&#xD;
        com1=&amp;#034;nc -l 8002 &amp;gt;output1 |nc raspberrypi.local 8000 &amp;lt;input1&amp;#034;;&#xD;
        com2=&amp;#034;nc -l 9002 &amp;gt;output2 |nc raspberrypi1.local 9000 &amp;lt;input2&amp;#034;;&#xD;
        com3=&amp;#034;nc -l 9502 &amp;gt;output3 |nc raspberrypi2.local 9500 &amp;lt;input3&amp;#034;;&#xD;
&#xD;
 - set object property&#xD;
&#xD;
        obj={&#xD;
           &amp;lt;|&amp;#034;name&amp;#034;-&amp;gt;node1,&amp;#034;comm&amp;#034;-&amp;gt;com1,&amp;#034;in&amp;#034;-&amp;gt;&amp;#034;input1&amp;#034;,&amp;#034;out&amp;#034;-&amp;gt;&amp;#034;output1&amp;#034;,&amp;#034;p&amp;#034;-&amp;gt;{2000,3500}|&amp;gt;,&#xD;
           &amp;lt;|&amp;#034;name&amp;#034;-&amp;gt;node2,&amp;#034;comm&amp;#034;-&amp;gt;com2,&amp;#034;in&amp;#034;-&amp;gt;&amp;#034;input2&amp;#034;,&amp;#034;out&amp;#034;-&amp;gt;&amp;#034;output2&amp;#034;,&amp;#034;p&amp;#034;-&amp;gt;{3501,4000}|&amp;gt;,&#xD;
           &amp;lt;|&amp;#034;name&amp;#034;-&amp;gt;node3,&amp;#034;comm&amp;#034;-&amp;gt;com3,&amp;#034;in&amp;#034;-&amp;gt;&amp;#034;input3&amp;#034;,&amp;#034;out&amp;#034;-&amp;gt;&amp;#034;output3&amp;#034;,&amp;#034;p&amp;#034;-&amp;gt;{4000,4500}|&amp;gt;};&#xD;
&#xD;
 - define the calculation server class; here it performs a sample Mersenne prime number calculation&#xD;
&#xD;
        new[nam_]:=Module[{ps,pe},&#xD;
           mersenneQ[n_]:=PrimeQ[2^n-1];&#xD;
           setv[nam[{s_,e_}]]^:={ps,pe}={s,e};&#xD;
           calc[nam]^:=Select[Range[ps,pe],mersenneQ]&#xD;
           ];&#xD;
&#xD;
 - construct instances&#xD;
&#xD;
        Map[new[#name]&amp;amp;,obj];&#xD;
&#xD;
 - deploy instances to calculation servers&#xD;
&#xD;
        Map[Save[#in,#name]&amp;amp;,obj];&#xD;
        Map[Run[#comm]&amp;amp;,obj];&#xD;
&#xD;
 - send message to each instance&#xD;
&#xD;
        Map[Put[Hold@setv[#name[#p]],#in]&amp;amp;,obj];&#xD;
        Map[Run[#comm]&amp;amp;,obj];&#xD;
&#xD;
 - start calculation&#xD;
&#xD;
        Map[Put[Hold@calc[#name],#in]&amp;amp;,obj];&#xD;
        proc=Map[StartProcess[{$SystemShell,&amp;#034;-c&amp;#034;,#comm}]&amp;amp;,obj]&#xD;
&#xD;
 - wait for the processes to terminate (manually in this sample code)&#xD;
&#xD;
        Map[ProcessStatus[#]&amp;amp;,proc]&#xD;
         {Finished,Finished,Finished}&#xD;
&#xD;
 - gather the results&#xD;
&#xD;
        Map[FilePrint[#out]&amp;amp;,obj];&#xD;
         {2203, 2281, 3217}&#xD;
        {}&#xD;
        {4253, 4423}&#xD;
&#xD;
&#xD;
  [1]: http://community.wolfram.com/groups/-/m/t/897081?p_p_auth=o5qxZhNR&#xD;
  [2]: https://www.slideshare.net/kobayashikorio/oop-for-mathematica&#xD;
  [3]: http://community.wolfram.com//c/portal/getImageAttachment?filename=2017-04-10.jpg&amp;amp;userId=897049</description>
    <dc:creator>Hirokazu Kobayashi</dc:creator>
    <dc:date>2017-04-10T01:15:22Z</dc:date>
  </item>
  <item rdf:about="https://community.wolfram.com/groups/-/m/t/196759">
    <title>Reading Temperature Sensors in the Wolfram Language on the RPi</title>
    <link>https://community.wolfram.com/groups/-/m/t/196759</link>
    <description>These sensors are pretty cool--they are [url=http://www.adafruit.com/products/374]cheap to buy[/url] and surprisingly sensitive to small changes in temperature. Here&amp;#039;s a first attempt I made to interact with the sensors in the Wolfram Language.&#xD;
&#xD;
For this setup I used DS18B20 temperature sensors and hooked them up to the Raspberry Pi breadboard according to Adafruit&amp;#039;s [url=http://learn.adafruit.com/adafruits-raspberry-pi-lesson-11-ds18b20-temperature-sensing/overview]setup guide[/url]. The board should look like the following diagram (make sure the sensor is hooked up to a 3.3V pin--not a 5V pin or you could fry the sensor):&#xD;
&#xD;
[center][img=width: 300px; height: 481px;]https://learn.adafruit.com/system/assets/assets/000/003/775/medium800/learn_raspberry_pi_summary.jpg[/img][/center]&#xD;
Once hooked up and connected to your Pi, run the following commands in the terminal:&#xD;
&#xD;
[code]sudo modprobe w1-gpio&#xD;
sudo modprobe w1-therm[/code]&#xD;
The temperatures are read from the sensor by &amp;#034;reading&amp;#034; the file that&amp;#039;s created in the devices directory. You can locate the file with the following commands:&#xD;
&#xD;
[code]cd /sys/bus/w1/devices&#xD;
ls[/code]&#xD;
This will show you the contents of your devices folder, where there should be a file titled 28-xxxx, where the xxxx is the serial number unique to your sensor. Once you&amp;#039;ve got that number, enter:&#xD;
&#xD;
[code]cd 28-xxxx (the xxxx should be replaced with the serial number unique to your sensor)&#xD;
cat w1_slave[/code]&#xD;
Two lines of data should be returned--if the first line ends with &amp;#034;YES&amp;#034;, then the 5-digit number at the end of the second line is the temperature, to be read as xx.xxx degrees Celsius.&#xD;
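&#xD;
For example, the two lines might look something like this (the bytes and the temperature will of course differ for your sensor and room):&#xD;
&#xD;
[code]72 01 4b 46 7f ff 0e 10 57 : crc=57 YES&#xD;
72 01 4b 46 7f ff 0e 10 57 t=23125[/code]&#xD;
Here t=23125 would be read as 23.125 degrees Celsius.&#xD;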
&#xD;
And now that we know that the temperature sensor is working, and we know how to find it, we can copy the file path and import it using the Wolfram Language.&#xD;
&#xD;
[mcode]Import[&amp;#034;/sys/bus/w1/devices/28-000004fe0343/w1_slave&amp;#034;][/mcode]&#xD;
Since this still returns a really long string of data that we don&amp;#039;t need, we can single out the temperature and then convert the string into a computable expression.&#xD;
&#xD;
[mcode]temp:=N[ToExpression[StringTake[Import[&amp;#034;/sys/bus/w1/devices/28-000004fe0343/w1_slave&amp;#034;],-5]]/1000][/mcode]&#xD;
So now when we read the file, we just get the temperature back!&#xD;
&#xD;
[mcode]temp&#xD;
(*22.312*)&#xD;
[/mcode]&#xD;
For kicks, I set up a scheduled task to plot the ambient temperature of my office every 60 seconds for 6 hours. Unsurprisingly, the temperature only fluctuated a few tenths of a degree...!&#xD;
&#xD;
[mcode]t={}&#xD;
RunScheduledTask[(deg=temp;AppendTo[t,deg]),{60,360}];&#xD;
Dynamic[ListLinePlot[t,Joined-&amp;gt;True,PlotRange-&amp;gt;Automatic]]&#xD;
[/mcode]&#xD;
And here&amp;#039;s what the graph looked like after a little bit of time--it truly is a sensitive device (the &amp;#034;large&amp;#034; dip down to 22.1 was me touching the sensor with my cold hands!):&#xD;
&#xD;
[center][img=width: 360px; height: 228px;]/c/portal/getImageAttachment?filename=temperaturereading3.jpg&amp;amp;userId=108162[/img][/center]Any suggestions for what to do next?</description>
    <dc:creator>Allison Taylor</dc:creator>
    <dc:date>2014-02-06T21:04:23Z</dc:date>
  </item>
  <item rdf:about="https://community.wolfram.com/groups/-/m/t/344241">
    <title>Reading high resolution weather data from Netatmo</title>
    <link>https://community.wolfram.com/groups/-/m/t/344241</link>
    <description>Together with [Björn Schelter][1] I have tried to read in data from the personal weather station [Netatmo][2] &#xD;
&#xD;
![enter image description here][3]&#xD;
&#xD;
which, as it turns out, is a very good companion for Mathematica; it also features in [Wolfram&amp;#039;s connected devices list][4]. This device measures temperature, humidity, pressure and noise level, indoors and outdoors, and optionally precipitation. Users are encouraged to share the outdoor data; as the weather station is rather popular, there are lots of measurements. On their website [https://www.netatmo.com/][5] the company makes these measurements available. You can display worldwide data&#xD;
&#xD;
![enter image description here][6]&#xD;
&#xD;
(I know that the figure does not show the entire world!) or zoom in to street-level data:&#xD;
&#xD;
![enter image description here][7]&#xD;
&#xD;
On the website [https://dev.netatmo.com][8] you can sign up for a developer account, which gives you access to the netatmo API. In this post I am going to show how to access the data with Mathematica. When you sign up for a netatmo developer account, you will be issued a client id and a client secret; these are rather long strings. You will also get a username and a password for your account. Next you need to request an access token, which you can do via the following command:&#xD;
&#xD;
&amp;gt; curl -X POST -d &#xD;
&amp;gt; &amp;#034;grant_type=password&amp;amp;client_id=AAAAAAAAAAAAAAAAAAAA&amp;amp;client_secret=BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB&#xD;
&amp;gt; &amp;amp;username=XXXXXXXXXXXXXXXX&amp;amp;password=YYYYYYYYYYYYYYY&amp;amp;scope=read_station&amp;#034;&#xD;
&amp;gt; http://api.netatmo.net/oauth2/token&amp;gt; ~/Desktop/request-token.txt&#xD;
&#xD;
On a Mac I put that string into a file called netatmo.sh on the desktop; I obviously substitute the client id for AAAAAAAAAAAA, the client secret for BBBBBBBBBBBBBBBB, and the username and password for XXXXXXXXXXXX and YYYYYYYYYY. Then I make the file executable with the terminal command &#xD;
&#xD;
&amp;gt; chmod a+x netatmo.sh&#xD;
&#xD;
The rest is child&amp;#039;s play. We execute the commands &#xD;
&#xD;
    Run[&amp;#034;~/Desktop/netatmo.sh&amp;#034;];&#xD;
    data = Import[&amp;#034;https://api.netatmo.net/api/getpublicdata?access_token=&amp;#034;&amp;lt;&amp;gt;Last[StringSplit[Import[&amp;#034;~/Desktop/request-token.txt&amp;#034;, &amp;#034;CSV&amp;#034;][[1, 1]], &amp;#034;\&amp;#034;&amp;#034;]] &amp;lt;&amp;gt; &#xD;
        &amp;#034;&amp;amp;lat_ne=59.91&amp;amp;lon_ne=13.75&amp;amp;lat_sw=40.42&amp;amp;lon_sw=-20.0&amp;amp;filter=True&amp;#034;, &amp;#034;Text&amp;#034;];&#xD;
&#xD;
Note that the numbers following lat_ne, lon_ne, lat_sw and lon_sw are the north-east and south-west latitudes and longitudes. If we want to request data for other regions, we can do so by changing these entries. Next we clean the data a little bit:&#xD;
&#xD;
    tab = Quiet[&#xD;
    Select[Select[Table[ToExpression /@ Flatten[StringSplit[#, &amp;#034;]&amp;#034;] &amp;amp; /@ StringSplit[#, &amp;#034;[&amp;#034;] &amp;amp; /@ &#xD;
    If[Length[StringSplit[StringSplit[data, &amp;#034;place&amp;#034;][[k]], &amp;#034;,&amp;#034;]] &amp;gt; 12, Drop[StringSplit[StringSplit[data, &amp;#034;place&amp;#034;][[k]],&amp;#034;,&amp;#034;], {5}], &#xD;
    StringSplit[StringSplit[data, &amp;#034;place&amp;#034;][[k]], &amp;#034;,&amp;#034;]]][[{2, 3, 7, 8, 15}]], {k, 2, Length[StringSplit[data, &amp;#034;place&amp;#034;]]}], Length[Cases[Flatten[#], $Failed]] == 0 &amp;amp;  ], Length[#] == 5 &amp;amp;]];&#xD;
&#xD;
That does look a bit cryptic, but it gives us what we want. &#xD;
&#xD;
    tab[[1]]&#xD;
&#xD;
gives {10.5673, 59.8929, 13, 80, 1032.9}, which are the GPS coordinates (longitude and latitude), the temperature in Celsius, the humidity in % and the pressure in mbar. I will now propose three different representations of the data.&#xD;
&#xD;
    scaled = Rescale[tab[[All, 3]]]; &#xD;
    GeoGraphics[Table[{GeoStyling[Opacity[0.99], RGBColor[scaled[[k]], 1 - scaled[[k]], 0]], GeoDisk[{tab[[k, 2]], tab[[k, 1]]}, Quantity[20, &amp;#034;Kilometers&amp;#034;] ]}, {k,1, Length[tab]}]]&#xD;
&#xD;
which gives:&#xD;
&#xD;
![enter image description here][9]&#xD;
&#xD;
The second representation is calculated using &#xD;
&#xD;
    GeoRegionValuePlot[GeoPosition[{#[[2]], #[[1]]}] -&amp;gt; #[[3]] &amp;amp; /@ tab, PlotRange -&amp;gt; {0, 30}, ColorFunction -&amp;gt; &amp;#034;TemperatureMap&amp;#034;, ImageSize -&amp;gt; Full]&#xD;
&#xD;
which looks like this&#xD;
&#xD;
![enter image description here][10]&#xD;
&#xD;
Finally, the lengthy sequence of commands&#xD;
&#xD;
    surface = Interpolation[{{#[[1]], #[[2]]}, #[[3]]} &amp;amp; /@ tab, InterpolationOrder -&amp;gt; 1];&#xD;
    cPlot = Quiet[ContourPlot[surface[x, y], {x, Min[tab[[All, 1]]], Max[tab[[All, 1]]]}, {y, Min[tab[[All, 2]]], Max[tab[[All, 2]]]}, ImagePadding -&amp;gt; None, &#xD;
    ClippingStyle -&amp;gt; None, Frame -&amp;gt; None, Contours -&amp;gt; 60, ContourLines -&amp;gt; False, PlotRange -&amp;gt; {0, 30}, ColorFunction -&amp;gt; &amp;#034;TemperatureMap&amp;#034;]];&#xD;
    multipoly = Polygon[GeoPosition[Join @@ (EntityValue[EntityClass[&amp;#034;Country&amp;#034;, &amp;#034;Europe&amp;#034;], &amp;#034;Polygon&amp;#034;] /. Polygon[GeoPosition[x_]] :&amp;gt; x)]];&#xD;
    GeoGraphics[{GeoStyling[{&amp;#034;GeoImage&amp;#034;, cPlot}], multipoly, Black, Opacity[1]}, ImageSize -&amp;gt; Full]&#xD;
&#xD;
gives this representation&#xD;
&#xD;
![enter image description here][11]&#xD;
&#xD;
I am quite sure that with some modifications one could turn this into a useful program using cloud deployment. Also, netatmo&amp;#039;s data is updated every 30 minutes (every 5 minutes on the individual devices), so one can run a scheduled task and follow the development of the temperature. The large number of netatmo weather stations complements the data available from the Wolfram data servers very nicely, as they provide very up-to-date street-level data.&#xD;
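&#xD;
A rough sketch of such a scheduled task could look like this; getTab is a hypothetical helper that is assumed to rerun the token request, import and cleaning steps above and return the cleaned table:&#xD;
&#xD;
    (*Sketch: record the mean temperature every 30 minutes; getTab is an assumed helper*)&#xD;
    temperatures = {};&#xD;
    RunScheduledTask[AppendTo[temperatures, Mean[getTab[][[All, 3]]]], 1800];&#xD;
    Dynamic[ListLinePlot[temperatures]]&#xD;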
&#xD;
I would be glad to see a good idea of a cloud deployed service based on this or any other ideas that you might have.&#xD;
&#xD;
Cheers,&#xD;
&#xD;
Marco&#xD;
&#xD;
&#xD;
  [1]: http://community.wolfram.com/web/bschelter&#xD;
  [2]: https://www.netatmo.com/en-US/product/weather-station&#xD;
  [3]: /c/portal/getImageAttachment?filename=ScreenShot2014-09-15at22.55.05.png&amp;amp;userId=48754&#xD;
  [4]: http://devices.wolfram.com/devices/netatmo-weather-station.html&#xD;
  [5]: https://www.netatmo.com/weathermap&#xD;
  [6]: /c/portal/getImageAttachment?filename=ScreenShot2014-09-15at22.59.02.png&amp;amp;userId=48754&#xD;
  [7]: /c/portal/getImageAttachment?filename=ScreenShot2014-09-15at22.59.37.png&amp;amp;userId=48754&#xD;
  [8]: https://dev.netatmo.com&#xD;
  [9]: /c/portal/getImageAttachment?filename=netatmofig1.gif&amp;amp;userId=48754&#xD;
  [10]: /c/portal/getImageAttachment?filename=Netatmofig2.gif&amp;amp;userId=48754&#xD;
  [11]: /c/portal/getImageAttachment?filename=Netatmofig3.gif&amp;amp;userId=48754</description>
    <dc:creator>Marco Thiel</dc:creator>
    <dc:date>2014-09-15T22:35:11Z</dc:date>
  </item>
  <item rdf:about="https://community.wolfram.com/groups/-/m/t/456947">
    <title>How to Make a Time Lapse Video With Your Raspberry Pi and Data Drop</title>
    <link>https://community.wolfram.com/groups/-/m/t/456947</link>
    <description>![Wolfram Pi Flowers][9]&#xD;
&#xD;
&#xD;
----------&#xD;
&#xD;
I will explain how to make the time-lapse animation you can see above.&#xD;
&#xD;
**1)** Set up your [camera module][1]. I stuck mine on a hard drive; see the first image in [my previous Data Drop post][2].&#xD;
&#xD;
![setup][3]&#xD;
&#xD;
**2)** Take a test shot to check that the exposure is acceptable.&#xD;
&#xD;
    DeviceRead[&amp;#034;RaspiCam&amp;#034;,{320, 240}]&#xD;
&#xD;
**3)** Adjust the resulting image with [ImageAdjust][4].&#xD;
&#xD;
    ImageAdjust[DeviceRead[&amp;#034;RaspiCam&amp;#034;,{320, 240}]]&#xD;
![Test shot][5]&#xD;
&#xD;
**4)** Create a new databin, and take note of its short ID:&#xD;
&#xD;
    CloudConnect[&amp;#034;email-wolframID&amp;#034;,&amp;#034;password&amp;#034; ];&#xD;
    bin=CreateDatabin[];&#xD;
    bin[&amp;#034;ShortID&amp;#034;]&#xD;
&amp;#034;3GgU-jf4&amp;#034;&#xD;
&#xD;
**5)** Set up a [ScheduledTask][6] that adds a snapshot to your databin every 360 seconds (6 minutes):&#xD;
&#xD;
    intervalometer=RunScheduledTask[DatabinAdd[Databin[&amp;#034;3GgU-jf4&amp;#034;], ImageAdjust[DeviceRead[&amp;#034;RaspiCam&amp;#034;,{320, 240}]]],360]&#xD;
&#xD;
**6)** Water your plant and wait.&#xD;
&#xD;
**7)** Check that your databin is being filled correctly [http://wolfr.am/3GgU-jf4][7]&#xD;
&#xD;
![databin][8]&#xD;
&#xD;
**8)** Compile the animated gif:&#xD;
&#xD;
    frames = Values[Databin[&amp;#034;3GgU-jf4&amp;#034;]]; &#xD;
    Export[&amp;#034;resurrected_plant.gif&amp;#034;, Join[frames, Reverse[frames]]]&#xD;
&#xD;
**9)** Enjoy!&#xD;
&#xD;
![Wolfram Pi Flowers][9]&#xD;
&#xD;
**10)** To stop your scheduled task, use the function [StopScheduledTask][10]:&#xD;
&#xD;
    StopScheduledTask[intervalometer]&#xD;
&#xD;
&#xD;
  [1]: http://community.wolfram.com/groups/-/m/t/157704&#xD;
  [2]: http://community.wolfram.com/groups/-/m/t/453169&#xD;
  [3]: /c/portal/getImageAttachment?filename=setupPlant.png&amp;amp;userId=56204&#xD;
  [4]: http://reference.wolfram.com/language/ref/ImageAdjust.html&#xD;
  [5]: /c/portal/getImageAttachment?filename=FlowerFrames.jpg&amp;amp;userId=56204&#xD;
  [6]: http://reference.wolfram.com/language/ref/RunScheduledTask.html&#xD;
  [7]: http://wolfr.am/3GgU-jf4&#xD;
  [8]: /c/portal/getImageAttachment?filename=databin_filled.png&amp;amp;userId=56204&#xD;
  [9]: /c/portal/getImageAttachment?filename=Wplant.gif&amp;amp;userId=56204&#xD;
  [10]: http://reference.wolfram.com/language/ref/StopScheduledTask.html</description>
    <dc:creator>Bernat Espigulé</dc:creator>
    <dc:date>2015-03-11T11:50:05Z</dc:date>
  </item>
  <item rdf:about="https://community.wolfram.com/groups/-/m/t/23261">
    <title>How do I connect Arduino to Mathematica kernel?</title>
    <link>https://community.wolfram.com/groups/-/m/t/23261</link>
    <description>I&amp;#039;ve recently started having fun with Arduino (see [b][url=http://www.arduino.cc/]www.arduino.com[/url][/b] for details).  The memory and processing speed is far below what is necessary to run the Mathematica kernel, but I wonder if an Arduino program might somehow make calls to a Mathematica kernel (running on a network-connected machine) and get results back.  Has anyone done this?</description>
    <dc:creator>David DeBrota</dc:creator>
    <dc:date>2012-10-19T12:17:10Z</dc:date>
  </item>
  <item rdf:about="https://community.wolfram.com/groups/-/m/t/181641">
    <title>An experiment in Moment of Inertia with Raspberry Pi / Arduino</title>
    <link>https://community.wolfram.com/groups/-/m/t/181641</link>
    <description>Wanted to share with my kids the classical experiment of a ball rolling down an inclined plane, using Legos, an Arduino, and Mathematica for the Raspberry Pi.

Using a window valance and some lego technic pieces we proceeded to build the inclined plane platform.

[img=width: 300px; height: 400px;]/c/portal/getImageAttachment?filename=inclinedplane1.JPG&amp;amp;userId=78214[/img]

Using Legos made it easy to attach both the LEDs and the photoresistors to the inclined plane. It also allowed us to build a platform on which to mount a servo motor and a protractor to measure the angle of inclination.
[img=width: 400px; height: 533px;]/c/portal/getImageAttachment?filename=inclinedplane2.JPG&amp;amp;userId=78214[/img]

We&amp;#039;ll detect when the ball reaches a certain point on the platform by the drop in LED light received at the photoresistors.
[img=width: 300px; height: 400px;]/c/portal/getImageAttachment?filename=8270photo(1).JPG&amp;amp;userId=78214[/img]

The Adafruit website explains in detail how to connect the photoresistors and LEDs to the Arduino.
[url=http://learn.adafruit.com/adafruit-arduino-lesson-2-leds/leds]http://learn.adafruit.com/adafruit-arduino-lesson-2-leds/leds[/url]
[url=http://learn.adafruit.com/photocells/using-a-photocell]http://learn.adafruit.com/photocells/using-a-photocell[/url]

As reading the photocells requires analog inputs, we decided to use a spare Arduino we had at our disposal and, to have more fun, XBee modules to connect the Arduino to the Raspberry Pi.
[img=width: 300px; height: 225px;]/c/portal/getImageAttachment?filename=6471photo(2).JPG&amp;amp;userId=78214[/img]

The following is the code running on the Arduino.[code]
#include &amp;lt;Servo.h&amp;gt; 
int servoPin = 9;
int LightSensor[4]={A1,A2,A3,A4};
int current[4]={0,0,0,0};
double timers[4]={0,0,0,0};
int previous[4]={0,0,0,0};

Servo servo;  
int i=0; 
double factor=0.7;
double timer = 0; 
int inByte =0;

void setup() 
{ 
  Serial.begin(19200);
  servo.attach(servoPin); 
  servo.write(0);
  delay(2000);
  for (i=0;i&amp;lt;4;i++){
    previous[i]=analogRead(LightSensor[i]);
  }
} 
void loop() 
{ 
if(Serial.available()&amp;gt;0){

  inByte=Serial.read();

 if (inByte==65) {
    servo.write(90);
    timer=millis();
    trackBall();
    for (i=0;i&amp;lt;4;i++){
      Serial.println(timers[i]);
    }
    servo.write(0);
  }
}
}

void trackBall(){
  int m = 0;
  while (m&amp;lt;4){
    current[m]=analogRead(LightSensor[m]);
    if (current[m]&amp;lt;factor*previous[m]){
      timers[m]=millis()-timer;
      m++;
    }
  }
}[/code]
Mathematica code and results to follow
[mcode]serial = DeviceOpen[&amp;#034;Serial&amp;#034;, {&amp;#034;/dev/ttyUSB0&amp;#034;, &amp;#034;BaudRate&amp;#034; -&amp;gt; 19200}]
lengths = {0., 0.175, 0.43, 0.69}; (*Distance between light sensors in meters*)
data = {}; (*List to capture the time between sensors*)
ping := Module[{}, DeviceWriteBuffer[serial, {&amp;#034;A&amp;#034;}]; Pause[3];
  ToExpression[StringSplit[FromCharacterCode[DeviceReadBuffer[serial]]]]/1000]
(*The function ping sends an &amp;#034;A&amp;#034; to the Arduino to release the ball and collect the timings*)
Button[&amp;#034;Collect Data&amp;#034;, data = Append[data, ping]]
Dynamic[tdata = Transpose@data;
 dataPoints =
  Flatten[Table[{tdata[[i, j]], lengths[[i]]}, {i, Length[lengths]}, {j, Length[data]}], 1];
 lm = Fit[dataPoints, {1, x, x^2}, x];
 Plot[lm, {x, -0.3, 1.2},
  Epilog -&amp;gt; {Red, PointSize[Large], Point[dataPoints]},
  AspectRatio -&amp;gt; 1, PlotLabel -&amp;gt; lm]][/mcode]Using a hollow ball, we can work out the value of g
[img=width: 440px; height: 500px;]/c/portal/getImageAttachment?filename=Untitled.png&amp;amp;userId=78214[/img]
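The quadratic fit can also give an estimate of g. A minimal sketch, assuming the fitted model lm from the block above, a hypothetical incline angle of 10 degrees read off the protractor, and a hollow ball (moment of inertia 2/3 m r^2, so the acceleration down the plane is a = g sin(theta)/(1 + 2/3)):[mcode]theta = 10. Degree; (*Hypothetical angle - replace with the protractor reading*)
a = 2 Coefficient[lm, x, 2]; (*Distance = a t^2/2, so acceleration is twice the x^2 coefficient*)
g = a (1 + 2/3)/Sin[theta] (*For a solid ball, such as a golf ball, use 2/5 instead of 2/3*)[/mcode]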

Using a golf ball
[img=width: 500px; height: 482px;]/c/portal/getImageAttachment?filename=InclinedPlane.png&amp;amp;userId=78214[/img]

A couple of videos of the experiment in action
[url=https://www.youtube.com/watch?v=4eRZ48N3vM8]https://www.youtube.com/watch?v=4eRZ48N3vM8[/url]
[url=https://www.youtube.com/watch?v=eLVPpEcyw_0]https://www.youtube.com/watch?v=eLVPpEcyw_0[/url]
[url=https://www.youtube.com/watch?v=53WaEcRWY_0]https://www.youtube.com/watch?v=53WaEcRWY_0[/url]</description>
    <dc:creator>Diego Zviovich</dc:creator>
    <dc:date>2014-01-08T01:09:54Z</dc:date>
  </item>
  <item rdf:about="https://community.wolfram.com/groups/-/m/t/90931">
    <title>Connecting the Leap Motion controller to Mathematica Using JLink</title>
    <link>https://community.wolfram.com/groups/-/m/t/90931</link>
    <description>So the Leap Motion controller has just been released and I thought I might as well post some code that Todd Gayley and I wrote to connect the Leap Motion and all of its functionality to Mathematica!

Here are the setup instructions:


1. (If you haven&amp;#039;t already) download and install the leap motion
software from [b][url=http://www.leapmotion.com/setup]leapmotion.com/setup[/url][/b]

2. Download the leap motion SDK off their developer page.

3. Connect the Leap Motion to your computer.

4. Find the file &amp;#034;LeapJava.jar&amp;#034; inside of the Leap SDK (It should be under lib.)

5. Open Mathematica and paste this in:
[mcode]Needs[&amp;#034;JLink`&amp;#034;];
ReinstallJava[CommandLine -&amp;gt; &amp;#034;java&amp;#034;, 
  JVMArguments -&amp;gt; 
   &amp;#034;-Djava.library.path=[path to directory CONTAINING LeapJava.jar]&amp;#034;];[/mcode]Now put the path in the indicated position; remember this is the path to the directory containing LeapJava.jar, not the path to LeapJava.jar itself. Then run the code.

6. Now paste this in:[mcode]AddToClassPath[&amp;#034;[path to LeapJava.jar]&amp;#034;];[/mcode]Replace the indicated area with the path to LeapJava.jar itself and run it.

7. To setup the controller, run this:[mcode]controller = JavaNew[&amp;#034;com.leapmotion.leap.Controller&amp;#034;][/mcode]

You should now be fully connected and ready to go!  To see if it is working you can run:[mcode]Methods[controller][/mcode]and it should return a list of all the methods under the controller.

The main method you want to look at is frame[] which contains all of the information about what is going on.

To see the methods under frame[] run this:[mcode]Methods[controller@frame[]]
[/mcode]
From here you should be able to figure things out like this:
[mcode]controller@frame[]@fingers@count[][/mcode]which should count the visible fingers in the scene.
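To watch that count update live, the call can be wrapped in Dynamic. A minimal sketch (the 0.5-second UpdateInterval is an arbitrary choice):[mcode]Dynamic[Refresh[controller@frame[]@fingers@count[], UpdateInterval -&amp;gt; 0.5]][/mcode]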

Also, as a final note, I want to mention that I was having problems accessing the fingers with this:[mcode]controller@frame[]@finger[finger number][/mcode]and so you may want to try:[mcode]controller@frame[]@fingers[]@get[finger number][/mcode]instead.  The same goes for palms.

Well, if you write anything cool using this please post it back up here along with any questions you have.

(I am going to try to post some of the code I have written with this soon :D )</description>
    <dc:creator>Christopher Wolfram</dc:creator>
    <dc:date>2013-08-06T17:30:52Z</dc:date>
  </item>
  <item rdf:about="https://community.wolfram.com/groups/-/m/t/1179035">
    <title>Mike Foale&amp;#039;s Machine Learning Flight System</title>
    <link>https://community.wolfram.com/groups/-/m/t/1179035</link>
    <description>## Introduction ##&#xD;
A few weeks ago, I had the privilege of speaking with Michael Foale -- astronaut, astrophysicist, entrepreneur, and Mathematica fan -- about his recent work with the Wolfram Language on the Raspberry Pi.  As an experienced pilot, Mike thought he could use the Wolfram Language&amp;#039;s machine learning capabilities to alert pilots to abnormal or unsafe flying conditions.  Using a collection of sensors attached to a Pi, the Wolfram Language&amp;#039;s Classify function would constantly analyze things like altitude, velocity, roll, pitch, and yaw to determine whether the pilot might be about to lose control of the plane.  You can read [Mike&amp;#039;s User story here][1] and his experience with [Mathematica on the Space Station Mir here][2], and the full code for Mike&amp;#039;s Imminent Loss Of Control Identification system, or ILOCI, [is here][3].&#xD;
&#xD;
## Getting the Initial Data ##&#xD;
Mike decided that the Raspberry Pi would be a good base for this project because of how easy it is to connect various sensors to the device.  In his case, there were 3 sensors he wanted to read from:  an accelerometer, an aero sensor attached to the tail, and a flex sensor attached to the rudder.  All of these are attached via GPIO pins, which the Wolfram Language can communicate through.  In Mike&amp;#039;s case, he wrote a MathLink driver in C to do this, but since he began this project we have been pushing to make GPIO communication easier.  Now, the Wolfram Language can read from sensors with [the DeviceOpen and DeviceRead functions][4].  If Mike were to re-do this project, that would have saved him quite a bit of time and debugging!&#xD;
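&#xD;
With the device framework, such a GPIO read takes only a couple of lines. A minimal sketch, assuming a digital sensor wired to pin 17 (the pin number is hypothetical):&#xD;
&#xD;
    gpio = DeviceOpen[&amp;#034;GPIO&amp;#034;];  (* open the GPIO interface on the Pi *)&#xD;
    DeviceRead[gpio, 17]  (* read the current state of pin 17 *)&#xD;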
&#xD;
![Left: Prototype Pi and sensor setup.  Right: ILOCI attached to a small plane for a test flight.][5]&#xD;
&#xD;
Before the Classify function can tell whether current flying conditions are normal or not, it must first learn what normal is.  To do this, Mike took his Pi setup on a test-flight where he occasionally forced his plane to stall, or come close to stalling (please, do not try this for yourself!).  After landing, he manually separated the sensor readings into &amp;#034;in family&amp;#034; and &amp;#034;out of family&amp;#034; moments -- that is, normal flying conditions and moments where loss of control is imminent.&#xD;
&#xD;
## Importing the Data ##&#xD;
For Mike&amp;#039;s flights, he had the Pi save data from an initial test flight to a CSV file and had Mathematica import that file to train his Classify function on.  For example:&#xD;
&#xD;
    datafile = FileNameJoin[{&amp;#034;&amp;#034;,&amp;#034;home&amp;#034;,&amp;#034;pi&amp;#034;,&amp;#034;Documents&amp;#034;,&amp;#034;training.dat&amp;#034;}];&#xD;
    data = Import[datafile];&#xD;
&#xD;
For convenience&amp;#039;s sake, I&amp;#039;ve embedded this same data in a notebook attached to this post, so that you can test and manipulate this data on your own as well.  This data, saved under the variable &amp;#034;data&amp;#034;, was divided up by Mike into the &amp;#034;in family&amp;#034; and &amp;#034;out of family&amp;#034; periods mentioned earlier:&#xD;
&#xD;
    noloc1 = Take[data, {3500, 3800}];  (* Ground, engine off *)&#xD;
    noloc2 = Take[data, {3900, 4200}]; (* Ground, engine on *)&#xD;
    noloc3 = Take[data, {4500, 4600}]; (* Takeoff, climbing at 60 knots *)&#xD;
    noloc4 = Take[data, {4800, 4900}]; (* Climbing, 70 knots *)&#xD;
    noloc5 = Take[data, {5000, 5200}]; (* Climbing, 50 knots *)&#xD;
    noloc6 = Take[data, {6200, 6300}]; (* Gliding, 42 knots *)&#xD;
    noloc7 = Take[data, {6900, 7100}]; (* Climbing, 60 to 70 knots *)&#xD;
    noloc8 = Take[data, {9200, 9400}]; (* Landing *)&#xD;
    noloc9 = Take[data, {9300, 9400}]; (* Rolling out*)&#xD;
    loc1 = Take[data, {6450, 6458}]; (* Straight stall *)&#xD;
    loc2 = Take[data, {6480, 6484}]; (* Left stall *)&#xD;
    loc3 = Take[data, {6528, 6534}]; (* Right stall *)&#xD;
    loc4 = Take[data, {6693, 6700}]; (* Right Yaw *)&#xD;
    loc5 = Take[data, {6720, 6727}]; (* Left Yaw *)&#xD;
&#xD;
You might notice that the list above doesn&amp;#039;t cover the entire dataset.  That&amp;#039;s because some of the data is kept aside for verification, to ensure the Classify function is recognizing &amp;#034;in family&amp;#034; and &amp;#034;out of family&amp;#034; moments correctly.  This is a very important part of training any artificially intelligent program!  Below are some plots of the above datasets, just to get a visual feel for what the Pi is &amp;#034;seeing&amp;#034; during a flight.&#xD;
&#xD;
    ListLinePlot[Transpose[noloc4], &#xD;
     PlotLegends -&amp;gt; {&amp;#034;AccelerationZ&amp;#034;, &amp;#034;AccelerationY&amp;#034;, &amp;#034;AccelerationX&amp;#034;, &#xD;
       &amp;#034;RudderDeflection&amp;#034;, &amp;#034;ElevatorDeflection&amp;#034;, &amp;#034;TailYawRelative&amp;#034;, &#xD;
       &amp;#034;TailAngleAttackRelative&amp;#034;, &amp;#034;TailAirspeedRelative&amp;#034;}, Frame -&amp;gt; True]&#xD;
&#xD;
![Sensor readings while climbing at 70 knots][6]&#xD;
&#xD;
    ListLinePlot[Transpose[loc1], &#xD;
     PlotLegends -&amp;gt; {&amp;#034;AccelerationZ&amp;#034;, &amp;#034;AccelerationY&amp;#034;, &amp;#034;AccelerationX&amp;#034;, &#xD;
      &amp;#034;RudderDeflection&amp;#034;, &amp;#034;ElevatorDeflection&amp;#034;, &amp;#034;TailYawRelative&amp;#034;, &#xD;
      &amp;#034;TailAngleAttackRelative&amp;#034;, &amp;#034;TailAirspeedRelative&amp;#034;}, Frame -&amp;gt; True]&#xD;
&#xD;
![Sensor readings while stalled][7]&#xD;
&#xD;
These plots give us a good idea of what happens at some specific instances, but what does the flight as a whole look like?   Recall from above that takeoff begins around datapoint 4500, the first stall around 6450, landing around 9200, and the rollout ends around 9400.  According to Mike, the rudder movement around 11000 is simply moving the rudder to steer the aircraft back into the hangar.&#xD;
&#xD;
    ListLinePlot[Transpose[data], &#xD;
     PlotLegends -&amp;gt; {&amp;#034;AccelerationZ&amp;#034;, &amp;#034;AccelerationY&amp;#034;, &amp;#034;AccelerationX&amp;#034;, &#xD;
      &amp;#034;RudderDeflection&amp;#034;, &amp;#034;ElevatorDeflection&amp;#034;, &amp;#034;TailYawRelative&amp;#034;, &#xD;
      &amp;#034;TailAngleAttackRelative&amp;#034;, &amp;#034;TailAirspeedRelative&amp;#034;}, Frame -&amp;gt; True, &#xD;
     ImageSize -&amp;gt; Large]&#xD;
&#xD;
![Sensor readings from whole flight][8]&#xD;
&#xD;
## Training the Classifier Data ##&#xD;
After separating the data, Mike created Rules to label these moments as &amp;#034;Normal&amp;#034;, &amp;#034;Stall Response&amp;#034;, &amp;#034;Yaw Response Left&amp;#034;, and &amp;#034;Yaw Response Right&amp;#034;; respectively &amp;#034;N&amp;#034;, &amp;#034;D&amp;#034;, &amp;#034;L&amp;#034;, and &amp;#034;R&amp;#034;.  These Rules teach Classify which patterns belong to which label, so that later on Classify can tell what the appropriate label is for incoming, unlabeled data.  Note that the ConstantArray functions simply repeat the data 10 times so the &amp;#034;out of family&amp;#034; moments are not overshadowed by the &amp;#034;in family&amp;#034; ones.&#xD;
&#xD;
    normal = Join[noloc1, noloc2, noloc3, noloc4, noloc5, noloc6, noloc7, noloc8];&#xD;
    normalpairs = Rule[#1, &amp;#034;N&amp;#034;] &amp;amp; /@ normal;&#xD;
&#xD;
    down = Flatten[ConstantArray[Join[loc1, loc2, loc3], 10], 1];&#xD;
    downpairs = Rule[#1, &amp;#034;D&amp;#034;] &amp;amp; /@ down;&#xD;
&#xD;
    left = Flatten[ConstantArray[loc4, 10], 1];&#xD;
    leftpairs = Rule[#1, &amp;#034;L&amp;#034;] &amp;amp; /@ left;&#xD;
&#xD;
    right = Flatten[ConstantArray[loc5, 10], 1];&#xD;
    rightpairs = Rule[#1, &amp;#034;R&amp;#034;] &amp;amp; /@ right;&#xD;
&#xD;
Finally, with the data segmented and labeled, Mike created a ClassifierFunction able to take live data from the sensors, then quickly tell the pilot when something is wrong and how to correct it.&#xD;
&#xD;
    classify = Classify[Join[normalpairs, downpairs, leftpairs, rightpairs], Method -&amp;gt; {&amp;#034;NeuralNetwork&amp;#034;, PerformanceGoal -&amp;gt; &amp;#034;Quality&amp;#034;}]&#xD;
    ClassifierInformation[classify]&#xD;
&#xD;
![Output of the ClassifierInformation function][9]&#xD;
&#xD;
Let&amp;#039;s use this ClassifierFunction on a couple of the data points that we set aside earlier, to be sure the ClassifierFunction is correct, and to show how a ClassifierFunction is used.&#xD;
&#xD;
    nolocVerify = Take[data, {4600, 4700}];&#xD;
    locVerify = Take[data, {6587, 6595}];&#xD;
&#xD;
    classify[nolocVerify]&#xD;
&#xD;
    classify[locVerify]&#xD;
&#xD;
![Results of Mike&amp;#039;s classifier function on untrained data][10]&#xD;
&#xD;
Recall that the ClassifierFunction returns one of 4 labels for each data point:  &amp;#034;N&amp;#034; for normal, &amp;#034;L&amp;#034; for left yaw response, &amp;#034;R&amp;#034; for right yaw response, and &amp;#034;D&amp;#034; for down response.  Mike&amp;#039;s ClassifierFunction perfectly recognizes the first verification set, and comes close to being perfect on the second set.  Not bad, given how little data he had to train it with!&#xD;
&#xD;
This use-case gives a pretty good idea of how the ClassifierFunction works, but for a more in-depth example you can watch [this video from Wolfram Research][11].&#xD;
&#xD;
## Using the ClassifierFunction on Real Data ##&#xD;
At the moment I am neither a pilot nor the owner of an airplane, so performing a live test of Mike&amp;#039;s ClassifierFunction would be a bit challenging.  Fortunately, the Wolfram Language makes it easy to take Mike&amp;#039;s recorded data and re-run it as though the ClassifierFunction were receiving this data in real time.  First, we need to import the data that Mike recorded in his test flight.  We actually did this already, when we called Import on the file containing the data.  Next we&amp;#039;ll import the timing data from the test flight.  This is the absolute time in seconds and microseconds from the beginning and end of the flight, so subtracting the beginning time from the end time gives us the total time of the flight.  With the times and the data known, we can determine how often the Pi read in measurements on the test flight.  We will use that frame time to &amp;#034;replay&amp;#034; the test flight accurately.  Mike&amp;#039;s ILOCI system records timing information that we can Import from, but again I&amp;#039;ll include it here for the sake of convenience:&#xD;
&#xD;
    timing = {1466861802, 255724, 1466867826, 498879, 11660};&#xD;
    dataseconds = timing[[3]] - timing[[1]];&#xD;
    datausecs = timing[[4]] - timing[[2]];&#xD;
    frametime = (dataseconds + datausecs*1.0*^-6)/(Length[data] - 1)&#xD;
&#xD;
Now, let&amp;#039;s use that frame length to create an animated output of the ClassifierFunction.  This goes through the whole dataset and runs at the same rate as the Pi did.  If we were actually flying an airplane, this would show us what the Pi thinks about our current environment, whether our motion is &amp;#034;in-family&amp;#034; or &amp;#034;out-of-family&amp;#034;.&#xD;
&#xD;
    Animate[&#xD;
     classify[ data[[frame]] ],&#xD;
     {frame, 1, Length[data], 1},&#xD;
     AnimationRate -&amp;gt; (1/frametime),&#xD;
     FrameLabel -&amp;gt; {{None, None}, {None, &amp;#034;Full Flight Playback&amp;#034;}}&#xD;
    ]&#xD;
&#xD;
This would take a little over 90 minutes to run, and the non-normal readings go by fairly quickly, so let&amp;#039;s focus in on some of the more interesting sections.  First, let&amp;#039;s see the straight stall that was reserved for verification.  Again, the ClassifierFunction was not trained using this set -- it is brand new as far as the Classifier is concerned.&#xD;
&#xD;
    Animate[&#xD;
     classify[ data[[frame]] ],&#xD;
     {frame, 6580, 6595, 1},&#xD;
     AnimationRate -&amp;gt; (1/frametime),&#xD;
     FrameLabel -&amp;gt; {{None, None}, {None, &amp;#034;Stall Playback&amp;#034;}}&#xD;
    ]&#xD;
&#xD;
![Animate of the stall playback][12]&#xD;
&#xD;
Notice that the Classifier constantly reads the situation and updates its classification accordingly.  Next let&amp;#039;s look at one of the verification sets where everything was normal.  It should read as &amp;#034;N&amp;#034; for the entire set:&#xD;
&#xD;
    Animate[&#xD;
     classify[ data[[frame]] ],&#xD;
     {frame, 6300, 6400, 1},&#xD;
     AnimationRate -&amp;gt; (1/frametime),&#xD;
     FrameLabel -&amp;gt; {{None, None}, {None, &amp;#034;Normal Playback&amp;#034;}}&#xD;
    ]&#xD;
&#xD;
![Animate of the normal playback][13]&#xD;
&#xD;
## Conclusion ##&#xD;
If we go back and count, there are about 60 lines of code. That&amp;#039;s all that was needed to create plots, animations, and a neural net-based Classifier that might one day save lives.  This is what makes the Wolfram Language such a powerful choice for projects like this -- quick prototyping and a plethora of built-in functions allow users to create some truly unique projects, regardless of experience or expertise.  We hope that this will inspire you to start experimenting with your own ideas with the Wolfram Language!&#xD;
&#xD;
&#xD;
  [1]: http://www.wolfram.com/mathematica/customer-stories/training-a-neural-network-to-think.html&#xD;
  [2]: http://www.wolfram.com/mathematica/customer-stories/astronaut-places-a-customer-service-call-to-wolfram-research-from-space-station-mir.html&#xD;
  [3]: https://github.com/cfoale/ILOCI&#xD;
  [4]: http://reference.wolfram.com/language/guide/UsingConnectedDevices.html&#xD;
  [5]: http://community.wolfram.com//c/portal/getImageAttachment?filename=PiAndPlaneSetup.png&amp;amp;userId=313765&#xD;
  [6]: http://community.wolfram.com//c/portal/getImageAttachment?filename=LinePlot1.png&amp;amp;userId=313765&#xD;
  [7]: http://community.wolfram.com//c/portal/getImageAttachment?filename=LinePlot2.png&amp;amp;userId=313765&#xD;
  [8]: http://community.wolfram.com//c/portal/getImageAttachment?filename=LinePlot3.png&amp;amp;userId=313765&#xD;
  [9]: http://community.wolfram.com//c/portal/getImageAttachment?filename=ClassifierInfo.png&amp;amp;userId=313765&#xD;
  [10]: http://community.wolfram.com//c/portal/getImageAttachment?filename=ClassifierResults.png&amp;amp;userId=313765&#xD;
  [11]: https://youtu.be/ce6UptPYKxI?t=20m40s&#xD;
  [12]: http://community.wolfram.com//c/portal/getImageAttachment?filename=StallPlayback.gif&amp;amp;userId=313765&#xD;
  [13]: http://community.wolfram.com//c/portal/getImageAttachment?filename=NormalPlayback.gif&amp;amp;userId=313765</description>
    <dc:creator>Brett Haines</dc:creator>
    <dc:date>2017-09-07T18:16:06Z</dc:date>
  </item>
  <item rdf:about="https://community.wolfram.com/groups/-/m/t/992466">
    <title>Classifier for Human Motions with data from an accelerometer</title>
    <link>https://community.wolfram.com/groups/-/m/t/992466</link>
    <description>This project was part of a Wolfram Mentorship Program.&#xD;
&#xD;
The classification of human motions based on patterns and physical data is of great importance in developing areas such as robotics. A function that recognizes a specific human motion can also be an important addition to artificial intelligence and physiological monitoring systems. This project is about acquiring, curating and analyzing experimental data from actions such as walking, running and climbing stairs. The data, taken with the help of an accelerometer, needs to be turned into an acceptable input for the Classify function. Finally, the function can be updated with more data and classes to make it more accurate and complete.&#xD;
&#xD;
**Algorithms and procedures**&#xD;
&#xD;
The data for this project was acquired by programming an Arduino UNO microcontroller from a Raspberry Pi computer, using the Wolfram Language. An accelerometer connected to the Arduino sent measurements each time it was called upon, and Mathematica on the Raspberry Pi collected and uploaded the data. &#xD;
The raw data had to be processed to make it a good input for the Classify function. First, it was transformed into a spectrogram (to analyze the frequency domain of the data). Then, the spectrogram&amp;#039;s image was put through the IFData function, which filters out some of the noise, and finally the images were converted into numerical data with the UpToMeasurements function (main function: ComponentMeasurements).&#xD;
This collection of numerical data was put into a classifier under six different classes (standing, walking, running, jumping, waving and climbing stairs).&#xD;
&#xD;
*The IFData and UpToMeasurements functions were sent to me by Todd Rowland during the Mentorship. Both functions are shown at the end of this post.&#xD;
&#xD;
**Example visualization**&#xD;
&#xD;
The following ListLinePlot is an extract from the jumping data &#xD;
&#xD;
![Example data][1]&#xD;
&#xD;
Next, the data from the plot above is turned into a spectrogram by the function Spectrogram, i.e.:   &#xD;
&#xD;
    spectrogramImage = &#xD;
     Spectrogram[jumpingData, SampleRate -&amp;gt; 10, FrameTicks -&amp;gt; None, &#xD;
      Frame -&amp;gt; False, Ticks -&amp;gt; None, FrameLabel -&amp;gt; None]&#xD;
&#xD;
&#xD;
&#xD;
![Example jumping data spectrogram][2]&#xD;
&#xD;
Finally, all the spectrogram images are used as input for the UpToMeasurements function, along with some properties for the ComponentMeasurements function. For example:&#xD;
&#xD;
&#xD;
    numericalData = &#xD;
     N@Flatten[&#xD;
       UpToMeasurements[&#xD;
        spectrogramImage, {&amp;#034;EnclosingComponentCount&amp;#034;, &amp;#034;Max&amp;#034;, &#xD;
         &amp;#034;MaxIntensity&amp;#034;, &amp;#034;TotalIntensity&amp;#034;, &amp;#034;StandardDeviationIntensity&amp;#034;, &#xD;
         &amp;#034;ConvexCoverage&amp;#034;, &amp;#034;Total&amp;#034;, &amp;#034;Skew&amp;#034;, &amp;#034;FilledCircularity&amp;#034;, &#xD;
         &amp;#034;MaxCentroidDistance&amp;#034;, &amp;#034;ExteriorNeighborCount&amp;#034;, &amp;#034;Area&amp;#034;, &#xD;
         &amp;#034;MinCentroidDistance&amp;#034;, &amp;#034;FilledCount&amp;#034;, &amp;#034;MeanIntensity&amp;#034;, &#xD;
         &amp;#034;StandardDeviation&amp;#034;, &amp;#034;Energy&amp;#034;, &amp;#034;Count&amp;#034;, &amp;#034;MeanCentroidDistance&amp;#034;}, &#xD;
        1]]&#xD;
&#xD;
This outputs a list of real numbers, one for each of the properties:&#xD;
&#xD;
    {0., 1., 1., 1., 1., 19294.9, 0.222164, 0.985741, 31011.8, 15212.5, \&#xD;
    9624.42, -0.0596506, 0.724527, 190.534, 0., 42584.5, 0.364667, \&#xD;
    42584., 0.453101, 0.315209, 0.232859, 0.169549, 0.00909654, 42584., \&#xD;
    98.7136}&#xD;
&#xD;
These numbers are grouped in a nested list which contains data for all six human motions. All the data is finally classified using the Classify function.&#xD;
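&#xD;
A minimal sketch of that final step, assuming a hypothetical association motionData that maps each class label to its list of property vectors:&#xD;
&#xD;
    (* motionData is a stand-in for the nested lists described above *)&#xD;
    trainingPairs = Flatten[Table[Rule[#, label] &amp;amp; /@ motionData[label],&#xD;
        {label, {&amp;#034;standing&amp;#034;, &amp;#034;walking&amp;#034;, &amp;#034;running&amp;#034;, &amp;#034;jumping&amp;#034;, &amp;#034;waving&amp;#034;, &amp;#034;stairs&amp;#034;}}]];&#xD;
    motionClassifier = Classify[trainingPairs, Method -&amp;gt; &amp;#034;SupportVectorMachine&amp;#034;]&#xD;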
&#xD;
After several combinations of both properties and data sets, I was able to produce classifier functions with an accuracy of 91% and a total size of 269 kB. &#xD;
&#xD;
------------------------------------------------------------&#xD;
&#xD;
**Attempt on building a classify function using image processing**&#xD;
&#xD;
On the other hand, the image processing capabilities of Mathematica let us extract data from images, so it should be possible to create a classifier that recognizes the moving patterns in the frames of a video. First, I had to take the noise out of every image; this proved to be troublesome, since the background can vary greatly between video samples. Then, I binarized each frame to isolate the moving particles and extracted their positions with ImageData. Lastly, a data set can be formed from all the analyzed frames; this data can essentially be used in the same way as the accelerometer&amp;#039;s, but the classifier was unsuccessful in separating the samples accurately. &#xD;
This was mainly because the accelerometer&amp;#039;s data is taken at a constant rate and very precisely, whereas the images depend on the camera&amp;#039;s frame rate and many other external factors; this made the data different enough that it could not be classified accurately. Nevertheless, if a big dataset is made from videos of people performing certain actions, the data processing can follow steps similar to the ones explained in this report, producing a similar classifier function. This could further increase the function&amp;#039;s accuracy, but the process needs an algorithm that can effectively trace the path of &amp;#034;a particle&amp;#034; moving through the frames of the video and extract precise velocity data from that movement.&#xD;
&#xD;
------------------------------------------------------------&#xD;
&#xD;
In conclusion, the classify function is working very well with the data provided; its accuracy is about 91% with the SupportVectorMachine method. This is a very good result for the human motion classifier. The next step is to add more classes to the function and test the classifier with data acquired from different sources, such as another accelerometer and various videos of human motion footage.&#xD;
&#xD;
-----------------------------------------&#xD;
&#xD;
**Code:**&#xD;
&#xD;
 - UpToMeasurements function&#xD;
&#xD;
        UpToMeasurements[image_,property_,n_]:=MaximalBy[ComponentMeasurements[image,&amp;#034;Count&amp;#034;],Last,UpTo[n]][[All,1]]/.ComponentMeasurements[image,property]&#xD;
&#xD;
*Note: This function simplifies the exploration of properties to input into ComponentMeasurements; it also outputs a usable list of numerical data retrieved from a given group of images.&#xD;
&#xD;
 - IFData function:&#xD;
&#xD;
        imagefunctions=&amp;lt;|1-&amp;gt; (EntropyFilter[#,3]&amp;amp;),&#xD;
        2-&amp;gt; (EdgeDetect[EntropyFilter[#,3]]&amp;amp;),&#xD;
        3-&amp;gt;Identity,&#xD;
        4-&amp;gt; (ImageAlign[reference110,#]&amp;amp;),&#xD;
        5-&amp;gt; (ImageHistogram[#,FrameTicks-&amp;gt;None,Frame-&amp;gt;False,FrameLabel-&amp;gt;None,Ticks-&amp;gt;None]&amp;amp;),&#xD;
        6-&amp;gt; (ImageApply[#^.6&amp;amp;,#]&amp;amp;),&#xD;
        7-&amp;gt; (Colorize[MorphologicalComponents[#]]&amp;amp;),&#xD;
        8-&amp;gt; (HighlightImage[#,ImageCorners[#,1,.001,5]]&amp;amp;),&#xD;
        9-&amp;gt; (HighlightImage[#,Graphics[Disk[{200,200},200]]]&amp;amp;),&#xD;
        10-&amp;gt; ImageRotate,&#xD;
        11-&amp;gt; (ImageRotate[#,45Degree]&amp;amp;),&#xD;
        12-&amp;gt;(ImageTransformation[#,Sqrt]&amp;amp;),&#xD;
        13-&amp;gt;(ImageTransformation[#,Function[p,With[{C=150.,R=35.},{p[[1]]+(R*Cos[(p[[1]]-C)*360*2/R]/6),p[[2]]}]]]&amp;amp;),&#xD;
        14-&amp;gt;( Dilation[#,DiskMatrix[4]]&amp;amp;),&#xD;
        15-&amp;gt;( ImageSubtract[Dilation[#,1],#]&amp;amp;),&#xD;
        16-&amp;gt; (Erosion[#,DiskMatrix[4]]&amp;amp;),&#xD;
        17-&amp;gt; (Opening[#,DiskMatrix[4]]&amp;amp;),&#xD;
        18-&amp;gt;(Closing[#,DiskMatrix[4]]&amp;amp;),&#xD;
        19-&amp;gt;DistanceTransform,&#xD;
        20-&amp;gt; InverseDistanceTransform,&#xD;
        21-&amp;gt; (HitMissTransform[#,{{1,-1},{-1,-1}}]&amp;amp;),&#xD;
        22-&amp;gt;(TopHatTransform[#,5]&amp;amp;),&#xD;
        23-&amp;gt;(BottomHatTransform[#,5]&amp;amp;), &#xD;
        24-&amp;gt; (MorphologicalTransform[Binarize[#],Max]&amp;amp;),&#xD;
        25-&amp;gt; (MorphologicalTransform[Binarize[#],&amp;#034;EndPoints&amp;#034;]&amp;amp;),&#xD;
        26-&amp;gt;MorphologicalGraph,&#xD;
        27-&amp;gt;SkeletonTransform,&#xD;
        28-&amp;gt;Thinning,&#xD;
        29-&amp;gt;Pruning,&#xD;
        30-&amp;gt; MorphologicalBinarize,&#xD;
        31-&amp;gt; (ImageAdjust[DerivativeFilter[#,{1,1}]]&amp;amp;),&#xD;
        32-&amp;gt; (GradientFilter[#,1]&amp;amp;),&#xD;
        33-&amp;gt; MorphologicalPerimeter,&#xD;
        34-&amp;gt; Radon&#xD;
        |&amp;gt;;&#xD;
        &#xD;
        reference110=BlockRandom[SeedRandom[&amp;#034;110&amp;#034;];Image[CellularAutomaton[110,RandomInteger[1,400],400]]];&#xD;
        &#xD;
        IFData[n_Integer]:=Lookup[imagefunctions,n,Identity]&#xD;
        &#xD;
        IFData[&amp;#034;Count&amp;#034;]:=Length[imagefunctions]&#xD;
        &#xD;
        IFData[All]:=imagefunctions&#xD;
&#xD;
*Note: This function groups together several image filtering functions; it was used to simplify exploring which functions to use in the classifier. &#xD;
**This function was written by the Wolfram team, but was slightly modified for this project.&#xD;
&#xD;
 - propertyVector function (this function automatically evaluates all the preliminary code needed to create the Classify functions):&#xD;
&#xD;
        propertyVector[property_]:={walkingvector=N@Flatten[UpToMeasurements[#,property,1]]&amp;amp;/@IFData[6]/@(Spectrogram[#,SampleRate-&amp;gt;10,FrameTicks-&amp;gt;None,Frame-&amp;gt;False,Ticks-&amp;gt;None,FrameLabel-&amp;gt;None]&amp;amp;/@walk);&#xD;
        jumpingvector=N@Flatten[UpToMeasurements[#,property,1]]&amp;amp;/@IFData[6]/@(Spectrogram[#,SampleRate-&amp;gt;10,FrameTicks-&amp;gt;None,Frame-&amp;gt;False,Ticks-&amp;gt;None,FrameLabel-&amp;gt;None]&amp;amp;/@jump);&#xD;
        standingvector=N@Flatten[UpToMeasurements[#,property,1]]&amp;amp;/@IFData[6]/@(Spectrogram[#,SampleRate-&amp;gt;10,FrameTicks-&amp;gt;None,Frame-&amp;gt;False,Ticks-&amp;gt;None,FrameLabel-&amp;gt;None]&amp;amp;/@stand);&#xD;
        runningvector=N@Flatten[UpToMeasurements[#,property,1]]&amp;amp;/@IFData[6]/@(Spectrogram[#,SampleRate-&amp;gt;10,FrameTicks-&amp;gt;None,Frame-&amp;gt;False,Ticks-&amp;gt;None,FrameLabel-&amp;gt;None]&amp;amp;/@run);&#xD;
        wavingvector=N@Flatten[UpToMeasurements[#,property,1]]&amp;amp;/@IFData[6]/@(Spectrogram[#,SampleRate-&amp;gt;10,FrameTicks-&amp;gt;None,Frame-&amp;gt;False,Ticks-&amp;gt;None,FrameLabel-&amp;gt;None]&amp;amp;/@wave);&#xD;
        stairsvector=N@Flatten[UpToMeasurements[#,property,1]]&amp;amp;/@IFData[6]/@(Spectrogram[#,SampleRate-&amp;gt;10,FrameTicks-&amp;gt;None,Frame-&amp;gt;False,Ticks-&amp;gt;None,FrameLabel-&amp;gt;None]&amp;amp;/@stairs);&#xD;
        walkingvectortest=N@Flatten[UpToMeasurements[#,property,1]]&amp;amp;/@IFData[6]/@(Spectrogram[#,SampleRate-&amp;gt;10,FrameTicks-&amp;gt;None,Frame-&amp;gt;False,Ticks-&amp;gt;None,FrameLabel-&amp;gt;None]&amp;amp;/@testwalk);&#xD;
        jumpingvectortest=N@Flatten[UpToMeasurements[#,property,1]]&amp;amp;/@IFData[6]/@(Spectrogram[#,SampleRate-&amp;gt;10,FrameTicks-&amp;gt;None,Frame-&amp;gt;False,Ticks-&amp;gt;None,FrameLabel-&amp;gt;None]&amp;amp;/@testjump);&#xD;
        standingvectortest=N@Flatten[UpToMeasurements[#,property,1]]&amp;amp;/@IFData[6]/@(Spectrogram[#,SampleRate-&amp;gt;10,FrameTicks-&amp;gt;None,Frame-&amp;gt;False,Ticks-&amp;gt;None,FrameLabel-&amp;gt;None]&amp;amp;/@teststand);&#xD;
        runningvectortest=N@Flatten[UpToMeasurements[#,property,1]]&amp;amp;/@IFData[6]/@(Spectrogram[#,SampleRate-&amp;gt;10,FrameTicks-&amp;gt;None,Frame-&amp;gt;False,Ticks-&amp;gt;None,FrameLabel-&amp;gt;None]&amp;amp;/@testrun);&#xD;
        wavingvectortest=N@Flatten[UpToMeasurements[#,property,1]]&amp;amp;/@IFData[6]/@(Spectrogram[#,SampleRate-&amp;gt;10,FrameTicks-&amp;gt;None,Frame-&amp;gt;False,Ticks-&amp;gt;None,FrameLabel-&amp;gt;None]&amp;amp;/@testwave);&#xD;
        stairsvectortest=N@Flatten[UpToMeasurements[#,property,1]]&amp;amp;/@IFData[6]/@(Spectrogram[#,SampleRate-&amp;gt;10,FrameTicks-&amp;gt;None,Frame-&amp;gt;False,Ticks-&amp;gt;None,FrameLabel-&amp;gt;None]&amp;amp;/@teststairs);}&#xD;
        &#xD;
        Training:=trainingSet=&amp;lt;|&amp;#034;walking&amp;#034;-&amp;gt;walkingvector,&amp;#034;running&amp;#034;-&amp;gt;runningvector,&#xD;
        &amp;#034;standing&amp;#034;-&amp;gt; standingvector,&#xD;
        &amp;#034;jumping&amp;#034;-&amp;gt; jumpingvector,&#xD;
        &amp;#034;waving&amp;#034;-&amp;gt; wavingvector,&#xD;
        &amp;#034;stairs&amp;#034;-&amp;gt; stairsvector|&amp;gt;;&#xD;
        &#xD;
        Test:=testSet=&amp;lt;|&amp;#034;walking&amp;#034;-&amp;gt;walkingvectortest,&amp;#034;running&amp;#034;-&amp;gt;runningvectortest,&#xD;
        &amp;#034;standing&amp;#034;-&amp;gt; standingvectortest,&#xD;
        &amp;#034;jumping&amp;#034;-&amp;gt; jumpingvectortest,&#xD;
        &amp;#034;waving&amp;#034;-&amp;gt; wavingvectortest,&#xD;
        &amp;#034;stairs&amp;#034;-&amp;gt; stairsvectortest|&amp;gt;;&#xD;
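&#xD;
        (* The twelve near-identical assignments above could be collapsed into one&#xD;
        helper; this is a hypothetical refactoring sketch reusing the post's own&#xD;
        UpToMeasurements, IFData and data variables, not the author's code. *)&#xD;
        spectroVector[data_,property_]:=N@Flatten[UpToMeasurements[#,property,1]]&amp;amp;/@IFData[6]/@(Spectrogram[#,SampleRate-&amp;gt;10,FrameTicks-&amp;gt;None,Frame-&amp;gt;False,Ticks-&amp;gt;None,FrameLabel-&amp;gt;None]&amp;amp;/@data);&#xD;
        (* mapping over an association keeps the activity labels as keys;&#xD;
        the test set would be built the same way from the test data *)&#xD;
        trainingSet=spectroVector[#,property]&amp;amp;/@&amp;lt;|&amp;#034;walking&amp;#034;-&amp;gt;walk,&amp;#034;running&amp;#034;-&amp;gt;run,&amp;#034;standing&amp;#034;-&amp;gt;stand,&amp;#034;jumping&amp;#034;-&amp;gt;jump,&amp;#034;waving&amp;#034;-&amp;gt;wave,&amp;#034;stairs&amp;#034;-&amp;gt;stairs|&amp;gt;;&#xD;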
&#xD;
 - Example code for the acceleration data acquisition from image processing:&#xD;
&#xD;
        images=Import[&amp;#034;$path&amp;#034;]&#xD;
        motionData=Count[#,1]&amp;amp;/@&#xD;
          Flatten[&#xD;
            ImageData[Binarize[ImageSubtract[ImageSubtract[#[[1]],#[[2]]],ImageSubtract[#[[2]],#[[3]]]]]]&amp;amp;/@&#xD;
              Partition[images,3,1],1]&#xD;
&#xD;
*Note: before this code can be used, the backgrounds of the frames of the video have to be removed, and the image has to be binarized as much as possible (some examples will be shown in the next section).&#xD;
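&#xD;
*A minimal preprocessing sketch, assuming the first frame is a clean background shot (this is an illustration, not the author's actual pipeline):&#xD;
&#xD;
        (* hypothetical: subtract a static background frame, then binarize *)&#xD;
        background=First[images];&#xD;
        cleaned=Binarize[ImageSubtract[#,background]]&amp;amp;/@Rest[images];&#xD;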
&#xD;
 - Example code for the retrieval of raw data from DataDrop:&#xD;
&#xD;
        rawData=Values[Databin[&amp;#034;Serial#&amp;#034;, {#}]];&#xD;
        data=Flatten[rawData[&amp;#034;(xacc/yacc/zacc)&amp;#034;]];&#xD;
&#xD;
---------------------------------&#xD;
&#xD;
**Please feel free to contact me or comment if you are interested in the rest of the code (uploading the C code to the Arduino, the manufacturer&amp;#039;s code for the accelerometer, the C switch statement that lets Mathematica communicate with the Arduino, and the Wolfram Language code used to start each loop in the switch that retrieves data). I can also send the Classify function, or anything else I might have left out; all suggestions are welcome.&#xD;
&#xD;
  [1]: http://community.wolfram.com//c/portal/getImageAttachment?filename=1.png&amp;amp;userId=602285&#xD;
  [2]: http://community.wolfram.com//c/portal/getImageAttachment?filename=2.png&amp;amp;userId=602285</description>
    <dc:creator>Pablo Ruales</dc:creator>
    <dc:date>2017-01-11T01:15:04Z</dc:date>
  </item>
  <item rdf:about="https://community.wolfram.com/groups/-/m/t/825781">
    <title>Mathematica slow down on Raspberry Pi 3</title>
    <link>https://community.wolfram.com/groups/-/m/t/825781</link>
    <description>Just benchmarked my RPI3 vs RPI2 Model B V1.1 with the built in Mathematica Benchmark[]. I was quite surprised to see the benchmark significantly decline. RPI3 had benchmark of 0.03 and time of  465 while the RPI2 had benchmark of 0.045 and time of 305 sec. On the RPI3, two tests dominated the time, matrix multiply and solving linear systems which took only slightly less than the entire benchmark on RPI2. Both systems were fully updated. I wonder if Wolfram could comment on this : do they feel it is the hardware or this might be improved in a future release?. As it is it is difficult to use Mathematica on an RPI and I was looking to RPI3 to improve this.&#xD;
Thanks</description>
    <dc:creator>david p</dc:creator>
    <dc:date>2016-03-19T02:02:04Z</dc:date>
  </item>
  <item rdf:about="https://community.wolfram.com/groups/-/m/t/1098055">
    <title>The periodic table - powered by Wolfram Language</title>
    <link>https://community.wolfram.com/groups/-/m/t/1098055</link>
    <description>Recently, I&amp;#039;ve been working on a project called Mandy, an interactive periodic table that displays different element trends depending on what the user says to her.  The Raspberry Pi/Mathematica duo has played a very strong role in all aspects of the project design and implementation.&#xD;
&#xD;
I&amp;#039;m not allowed to provide too many details about the project (not because it&amp;#039;s secret, but because I am moving in a month and my wife has ordered that all of my toys and hobbies get packed or I risk them being left behind).  Therefore, I [created a teaser trailer][2] showcasing the design with a promise to provide more details when I get settled in my new location.&#xD;
&#xD;
[![enter image description here][3]][2]&#xD;
&#xD;
[![enter image description here][4]][2]&#xD;
&#xD;
That said, I wanted to highlight a couple of areas where Mathematica played a pivotal role in the project.  My goal was to create a periodic table display (approximately 24x18&amp;#034;) that has an RGB LED for each element.  The color of each element would then be based on a given periodic trend (atomic radius, weight, ionization energy, etc.).  Controlling 118 3-color LEDs turns out to be very easy when the LEDs are Neopixels and the controller is an Arduino.  Because I envisioned a wall display, I wanted the user to interact with the piece in some fashion other than a mouse or keyboard.  I have started working on a voice recognition system based on pocketsphinx which I call [Simplified Command and Control - SCAC](https://bobthechemist.com/2015/12/prelude-to-simplified-command-and-control/) but since it is a C/Python project, I&amp;#039;ll leave that component for another forum.  In summary, the final project requires that SCAC (a Python script) interact with Mathematica (data manipulation), which then speaks to an Arduino via a serial connection.  But Mathematica played a big role prior to the implementation as well:&#xD;
&#xD;
## Design&#xD;
&#xD;
- With access to `ElementData`, I was able to very quickly create an image that could be sent to a laser cutter for carving the birch-wood frame and the acrylic element pieces.&#xD;
&#xD;
        o = Table[&#xD;
           Map[ElementData[i, #] &amp;amp;, {&amp;#034;AtomicNumber&amp;#034;, &amp;#034;Symbol&amp;#034;, &amp;#034;Period&amp;#034;, &#xD;
             &amp;#034;Group&amp;#034;}], {i, 118}];&#xD;
        (* Need to massage the f-block elements,giving them fake groups and \&#xD;
        periods.Making their periods 9 and 10 with their groups 3 through 16 \&#xD;
        works nicely *)&#xD;
        o[[57 ;; 70]] = Module[{i = 1, rep = Range[3, 16], tmp},&#xD;
           tmp = Select[o, 57 &amp;lt;= #[[1]] &amp;lt;= 70 &amp;amp;] /. {6 -&amp;gt; 9};&#xD;
           tmp /. {Missing[&amp;#034;NotApplicable&amp;#034;] :&amp;gt; rep[[i++]]}];&#xD;
        o[[89 ;; 102]] = Module[{i = 1, rep = Range[3, 16], tmp},&#xD;
           tmp = Select[o, 89 &amp;lt;= #[[1]] &amp;lt;= 102 &amp;amp;] /. {7 -&amp;gt; 10};&#xD;
           tmp /. {Missing[&amp;#034;NotApplicable&amp;#034;] :&amp;gt; rep[[i++]]}];&#xD;
        o = o /. {&amp;#034;Uut&amp;#034; -&amp;gt; &amp;#034;Nh&amp;#034;, &amp;#034;Uup&amp;#034; -&amp;gt; &amp;#034;Mc&amp;#034;, &amp;#034;Uus&amp;#034; -&amp;gt; &amp;#034;Ts&amp;#034;, &amp;#034;Uuo&amp;#034; -&amp;gt; &amp;#034;Og&amp;#034;};&#xD;
        piece = Polygon[{{0, 0}, {1, 0}, {1, 1}, {0, 1}, {0, 0}}];&#xD;
        Clear[box]&#xD;
        box[array_] := Module[{x, y, m = 10},&#xD;
          {x, y} = array[[{4, 3}]];&#xD;
          {FaceForm[None], EdgeForm[Thin],&#xD;
           Rectangle[{m x, m (10 - y)}, {m (x + 1), m (11 - y)}]&#xD;
           (*Inset[Style[array[[2]],10,Bold],{m (x+0.5),m(10.7-y)}],&#xD;
           Inset[Style[array[[1]],8],{m(x+0.5),m(10.2-y)}]*)}&#xD;
          ]&#xD;
        makePiece[pt_] := GeometricTransformation[&#xD;
           GeometricTransformation[piece, {pt[[1]], 10 - pt[[2]]}*{1.2, 1.2}],&#xD;
           ScalingTransform[{10, 10}]];&#xD;
        ptpuzzle = &#xD;
          Graphics[{EdgeForm[Thin], FaceForm[None], &#xD;
            makePiece /@ o[[All, {4, 3}]]}];&#xD;
        Clear[letters2]&#xD;
        letters2[array_] := Module[{x, y, m = 10},&#xD;
          {x, y} = array[[{4, 3}]];&#xD;
          {FaceForm[None], EdgeForm[Thin],&#xD;
           (*Rectangle[{m x,m(10-y)},{m(x+1),m(11-y)}]*)&#xD;
           Inset[Style[array[[2]], 8, Bold, &#xD;
             FontFamily -&amp;gt; &amp;#034;Cambria Math&amp;#034;], {m (x + 0.45), &#xD;
              m (10.55 - y)}*{1.2, 1.2}],&#xD;
           Inset[Style[array[[1]], 6, &#xD;
             FontFamily -&amp;gt; &amp;#034;Cambria Math&amp;#034;], {m (x + 0.45), m (10.2 - y)}*{1.2,&#xD;
               1.2}]}&#xD;
          ]&#xD;
        ptpuzzlelt = letters2 /@ o // Graphics;&#xD;
        Show[ptpuzzlelt, ptpuzzle, ImageSize -&amp;gt; 600]&#xD;
&#xD;
![enter image description here][5]&#xD;
&#xD;
There are easier ways to create a periodic table, but the above method allowed me to create SVG images suitable for tweaking in vector graphics software and cutting with the laser cutter.&#xD;
&#xD;
## Data&#xD;
&#xD;
Naturally, `ElementData` can provide the physical and chemical properties of the elements that I want to display on Mandy.  There&amp;#039;s nothing inspiring about this code (grabbing the data, rescaling it and converting values to a corresponding color scheme).  Mathematica provided a useful platform for sandboxing what the trends would look like:&#xD;
&#xD;
![enter image description here][6]&#xD;
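&#xD;
The rescale-and-color step might look something like this minimal sketch (the property name and color scheme here are placeholders, not the project's actual choices):&#xD;
&#xD;
    (* hypothetical: fetch a trend, strip units, rescale to [0,1], map to colors *)&#xD;
    trend = Table[ElementData[z, &amp;#034;AtomicRadius&amp;#034;], {z, 118}] /. {q_Quantity :&amp;gt; QuantityMagnitude[q], _Missing -&amp;gt; 0.};&#xD;
    colors = ColorData[&amp;#034;TemperatureMap&amp;#034;] /@ Rescale[trend];&#xD;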
&#xD;
## Implementation&#xD;
&#xD;
Since speech recognition (SCAC) is written in Python, I needed to control a Mathematica kernel from within Python.  I&amp;#039;ve [played with this idea before](https://github.com/bobthechemist/python-mathlink), which results in a functioning platform that is error-intolerant (READ: not ready for prime time).  Communication with the Arduino is done through a &amp;#034;Serial&amp;#034; device instead of the &amp;#034;Arduino&amp;#034; device because I started this project before the latter was working.  That said, it was pretty straightforward to create a Mathematica package that (a) opens serial communication with the Arduino, (b) reads in the element-LED data, and (c) sends a command to the Arduino to light the LEDs.&#xD;
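&#xD;
A stripped-down sketch of that serial round trip (the port name, baud rate and command format below are placeholders for illustration):&#xD;
&#xD;
    (* hypothetical: open the Arduino's serial port, send one LED command, close *)&#xD;
    arduino = DeviceOpen[&amp;#034;Serial&amp;#034;, {&amp;#034;/dev/ttyACM0&amp;#034;, &amp;#034;BaudRate&amp;#034; -&amp;gt; 9600}];&#xD;
    DeviceWrite[arduino, &amp;#034;42,255,0,0;&amp;#034;];  (* e.g. set LED 42 to red *)&#xD;
    DeviceClose[arduino];&#xD;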
&#xD;
I plan to post more details about the project, including code and design pictures, in due time.  See [my website](https://bobthechemist.com/2017/05/mandy-the-periodic-table-teaser/) for updates.  Right now, it sounds like I&amp;#039;ve used up my daily allocation of blogging time and have to go pack some boxes.&#xD;
&#xD;
&#xD;
  [1]: http://community.wolfram.com//c/portal/getImageAttachment?filename=trends.gif&amp;amp;userId=61884&#xD;
  [2]: https://www.youtube.com/watch?v=eI-IgJ3n_RU&#xD;
  [3]: http://community.wolfram.com//c/portal/getImageAttachment?filename=ScreenShot2017-08-04at7.37.04AM.png&amp;amp;userId=11733&#xD;
  [4]: http://community.wolfram.com//c/portal/getImageAttachment?filename=ScreenShot2017-08-04at7.37.59AM.png&amp;amp;userId=11733&#xD;
  [5]: http://community.wolfram.com//c/portal/getImageAttachment?filename=Capture.JPG&amp;amp;userId=61884&#xD;
  [6]: http://community.wolfram.com//c/portal/getImageAttachment?filename=trends.gif&amp;amp;userId=61884</description>
    <dc:creator>BoB LeSuer</dc:creator>
    <dc:date>2017-05-18T15:11:29Z</dc:date>
  </item>
  <item rdf:about="https://community.wolfram.com/groups/-/m/t/553861">
    <title>Wolfram Data Drop IFTTT Channel: track your elevation and much more</title>
    <link>https://community.wolfram.com/groups/-/m/t/553861</link>
    <description>Yesterday [IFTTT][1] (IF This Then That) released the [Wolfram Data Drop Channel][2]. This is a BIG step forward in making the data from the Internet of Things computable. Currently there are more than [150 trigger channels][3] that can be connected to [Data Drop][4]. Let&amp;#039;s take a look at one of those channels: [Numerous][5]. In this post I&amp;#039;ll show you how to get a Numerous recipe running in a few simple steps.&#xD;
&#xD;
- Sign in to [https://ifttt.com][6]. Go to My Recipes. Click &amp;#034;Create Recipe&amp;#034;&#xD;
&#xD;
![Click Create Recipe][7]&#xD;
&#xD;
- Step 1: Select the Numerous Trigger Channel&#xD;
&#xD;
![Choose Trigger][8]&#xD;
&#xD;
 - Step 2: Select the trigger that fires every time a number changes by any amount&#xD;
&#xD;
![chooseType][9]&#xD;
&#xD;
 - Step 3: Select the number that you want to track, elevation in my case&#xD;
&#xD;
![Select number][10]&#xD;
&#xD;
 - Step 4: Select Wolfram Data Drop as Your Action Channel&#xD;
&#xD;
![then action][11]&#xD;
![data drop channel][12]&#xD;
&#xD;
 - Step 5: Select &amp;#034;Add entry&amp;#034; Action&#xD;
&#xD;
![add entry][13]&#xD;
&#xD;
 - Step 6: Complete Action Fields&#xD;
&#xD;
![action fields][14]&#xD;
&#xD;
Use the Wolfram Language function [CreateDatabin][15] to provide a name, and to specify that we want the values of the entries to be interpreted as physical quantities (feet in this case).&#xD;
&#xD;
    bin = CreateDatabin[&amp;lt;|&amp;#034;Name&amp;#034; -&amp;gt; &amp;#034;My Elevation&amp;#034;|&amp;gt;, &#xD;
    &amp;#034;Interpretation&amp;#034; -&amp;gt; {&amp;#034;elevation&amp;#034; -&amp;gt;Restricted[&amp;#034;StructuredQuantity&amp;#034;,&amp;#034;Feet&amp;#034;]}];&#xD;
    bin[&amp;#034;ShortID&amp;#034;]&#xD;
**&amp;#034;6F9_LE8T&amp;#034;**&#xD;
&#xD;
 - Copy-paste this ID&#xD;
&#xD;
 - Write *elevation=* and select the ingredient **FormattedValue**.&#xD;
&#xD;
![ingredient][17]&#xD;
&#xD;
    elevation={{FormattedValue}}&#xD;
&#xD;
 - Step 7: Create and Connect&#xD;
&#xD;
![create][18]&#xD;
&#xD;
Voilà!&#xD;
======&#xD;
![Numerous recipe][19]&#xD;
&#xD;
Now you can gain insight into the elevation measurements with [Wolfram|Alpha][20] by entering *Data drop* along with your databin&amp;#039;s *ID*.&#xD;
**Data drop 6F9_LE8T** in my case.&#xD;
&#xD;
![WA][21]&#xD;
&#xD;
Or you can access this data directly from the Wolfram Language:&#xD;
&#xD;
    DateListPlot[Databin[&amp;#034;6F9_LE8T&amp;#034;], Filling -&amp;gt; Bottom, FillingStyle -&amp;gt; LightBrown]&#xD;
&#xD;
![WLelevation][22]&#xD;
&#xD;
Please, feel free to share your own recipes below. Enjoy!&#xD;
&#xD;
&#xD;
  [1]: https://ifttt.com/wtf&#xD;
  [2]: https://ifttt.com/wolfram_data_drop&#xD;
  [3]: https://ifttt.com/channels&#xD;
  [4]: https://datadrop.wolframcloud.com/&#xD;
  [5]: https://ifttt.com/numerous&#xD;
  [6]: https://ifttt.com&#xD;
  [7]: /c/portal/getImageAttachment?filename=createRecipe.png&amp;amp;userId=56204&#xD;
  [8]: /c/portal/getImageAttachment?filename=1.png&amp;amp;userId=56204&#xD;
  [9]: /c/portal/getImageAttachment?filename=2v.png&amp;amp;userId=56204&#xD;
  [10]: /c/portal/getImageAttachment?filename=2.png&amp;amp;userId=56204&#xD;
  [11]: /c/portal/getImageAttachment?filename=3.png&amp;amp;userId=56204&#xD;
  [12]: /c/portal/getImageAttachment?filename=3v.png&amp;amp;userId=56204&#xD;
  [13]: /c/portal/getImageAttachment?filename=4.png&amp;amp;userId=56204&#xD;
  [14]: /c/portal/getImageAttachment?filename=6v1.png&amp;amp;userId=56204&#xD;
  [15]: https://reference.wolfram.com/language/ref/CreateDatabin.html&#xD;
  [16]: /c/portal/getImageAttachment?filename=6v4.png&amp;amp;userId=56204&#xD;
  [17]: /c/portal/getImageAttachment?filename=6v5.png&amp;amp;userId=56204&#xD;
  [18]: /c/portal/getImageAttachment?filename=7.png&amp;amp;userId=56204&#xD;
  [19]: /c/portal/getImageAttachment?filename=recipe.png&amp;amp;userId=56204&#xD;
  [20]: https://www.wolframalpha.com/input/?i=Data+drop+6F9_LE8T&#xD;
  [21]: /c/portal/getImageAttachment?filename=WAelev.png&amp;amp;userId=56204&#xD;
  [22]: /c/portal/getImageAttachment?filename=WLelevation.png&amp;amp;userId=56204</description>
    <dc:creator>Bernat Espigulé</dc:creator>
    <dc:date>2015-08-26T16:55:04Z</dc:date>
  </item>
  <item rdf:about="https://community.wolfram.com/groups/-/m/t/408056">
    <title>Raspberry Pi goes to school: Wolfram Language programming in K5</title>
    <link>https://community.wolfram.com/groups/-/m/t/408056</link>
    <description>Learning to code and thinking computationally is becoming more and more popular. Ive seen it filter down into the elementary schools. Thats why I didnt hesitate when I had the opportunity to volunteer at [Kenwood Elementary school][1]. Kenwood is transitioning to becoming a computational lab school, where programming is as common as writing. I helped out at their afterschool program Tech Time, which is supported by the school and the University of Illinois and had a demo at their tech fair.&#xD;
&#xD;
Weve all seen the wonders Mathematica and the Wolfram Language have at the industry, university and high school levels. But what about at the elementary level? &#xD;
&#xD;
The school had access to four Raspberry Pis and with the help of the University, was able to get all the necessary hardware accessories. For the after-school program, I created a series of lesson plans (lasting about 30 minutes each) to introduce the Wolfram Language and get them familiar with the Raspberry Pi. The overall goal of the lessons was to use the RPi with a button and LED light to blink Morse Code.&#xD;
&#xD;
Overview of the lessons&#xD;
&#xD;
 - Lesson 1: Introducing the Raspberry Pi&#xD;
 - Lesson 2: Powering up and Introducing Wolfram Language&#xD;
 - Lesson 3: Working with Variables&#xD;
 - Lesson 4: Connecting to hardware&#xD;
 - Lesson 5: Morse Code with the Raspberry Pi&#xD;
&#xD;
Here is the setup that I used on the Raspberry Pi:&#xD;
![morse code setup for raspberry pi and wolfram language][2]&#xD;
&#xD;
Since I was working with elementary school students (4th and 5th graders mostly), I felt the notebook interface was more forgiving than the command line. I had them start up Mathematica as root in order for Mathematica to be able to communicate with the hardware.&#xD;
&#xD;
In the command line, start Mathematica as root:&#xD;
&#xD;
    sudo mathematica&#xD;
&#xD;
You can see in the attached notebooks what I used for the various lessons, but to blink the LED light on and off, I used the following:&#xD;
&#xD;
    n= True;&#xD;
    While[n,&#xD;
    	inv=DeviceRead[GPIO,4];&#xD;
    	While[inv=={4-&amp;gt;1},inv=DeviceRead[GPIO,4]];&#xD;
    	DeviceWrite[GPIO,17-&amp;gt;1];&#xD;
    	While[inv=={4-&amp;gt;0},inv=DeviceRead[GPIO,4]];&#xD;
    	DeviceWrite[GPIO,17-&amp;gt;0];&#xD;
    ];&#xD;
    &#xD;
Although there were some minor user errors that we needed to troubleshoot, the afterschool workshops went great. Students didn&amp;#039;t seem to have any problems with the language and wanted to know more. It was great to see them coming up with words to turn into Morse Code; the words generally focused on the mildly crude :)&#xD;
&#xD;
For each lesson, I briefly went over what each function did in the notebook.  My main focus was to get them playing around with the language. For example, in lesson 2, I had the students practice evaluating code by taking their name and jumbling the letters. They loved doing this and kept putting in different words to see what the outcome would be; again, the words generally focused on the mildly crude.&#xD;
&#xD;
In addition to the weekly afterschool workshops, Kenwood held a Saturday Tech fair at the school and invited me to participate. The added challenge to this arrangement was that the students I was going to be working with had never used a text-based programming language or seen a Raspberry Pi before.&#xD;
&#xD;
For the event, I set up two demos: Talk in Morse Code and Create a Stop Motion Video. My strategy was to have the kids use printouts of the Fritzing diagrams to put together the Raspberry Pi accessories and then have them evaluate Wolfram Language code snippets. &#xD;
&#xD;
Here was my setup:&#xD;
&#xD;
![tech fair demo station for raspberry pi and wolfram language][3]&#xD;
&#xD;
I used the same setup for the Morse Code demo as I did in the workshop. Here is the setup for the stop motion demo:&#xD;
![stop motion setup for raspberry pi and wolfram language][4]&#xD;
&#xD;
Not going to lie, I was a little nervous. But lo and behold, I was completely busy for the full two hours showing students and parents how to set up the Raspberry Pi and run the code. The parents were excited to see their children getting their hands dirty with the hardware and the software. They were especially impressed to see that with only a short amount of code you could generate a movie!&#xD;
&#xD;
I had a second grader set up the RPi, run the code I had for them, then start asking questions about changing the code! It was fantastic. Here are some students working on the Morse Code setup. A moment before, the one in the pink was giving me the &amp;#034;I got this, don&amp;#039;t need your help&amp;#034; look. :)&#xD;
&#xD;
![kenwood students with the raspberry pi and wolfram language][5]&#xD;
&#xD;
Here is the code that I used for the Stop Motion demo. Again first running Mathematica as root in the command line:&#xD;
&#xD;
    sudo mathematica&#xD;
&#xD;
To get the first image for the movie:&#xD;
&#xD;
    listImages={DeviceRead[&amp;#034;RaspiCam&amp;#034;]}&#xD;
&#xD;
Get the additional images for the movie:&#xD;
&#xD;
    n=1;&#xD;
    &#xD;
    While[n&amp;lt;5,&#xD;
    input=DeviceRead[&amp;#034;GPIO&amp;#034;,4];&#xD;
    While[input=={4-&amp;gt;1},input=DeviceRead[&amp;#034;GPIO&amp;#034;,4]];&#xD;
    image=DeviceRead[&amp;#034;RaspiCam&amp;#034;];&#xD;
    listImages=Append[listImages,image];&#xD;
    While[input=={4-&amp;gt;0},input=DeviceRead[&amp;#034;GPIO&amp;#034;,4]];&#xD;
    n++;&#xD;
    ]&#xD;
    &#xD;
To see the list of images:&#xD;
&#xD;
    listImages&#xD;
&#xD;
To turn the images into a movie, use the function ListAnimate:&#xD;
&#xD;
    ListAnimate[listImages]&#xD;
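&#xD;
As a possible extension (not part of the original demo), the same frame list can be written out as an animated GIF file with Export:&#xD;
&#xD;
    Export[&amp;#034;stopmotion.gif&amp;#034;, listImages]&#xD;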
&#xD;
[Here][6] is a link to download all my materials: the workshop materials (5 lessons&amp;#039; worth) in addition to the materials I used for the 2 demos at the Tech fair. &#xD;
&#xD;
Im a teacher first and a programmer second, so any comments that you have to improve or extend the activity would be fantastic!&#xD;
&#xD;
I had a great time at Kenwood and hope to have another opportunity like this in the future. I also wanted to give a special thanks to the folks at Kenwood Elementary school and the University of Illinois for all their help.&#xD;
&#xD;
&#xD;
  [1]: http://www.champaignschools.org/schools/home/?id=15&#xD;
  [2]: /c/portal/getImageAttachment?filename=image1.png&amp;amp;userId=21124&#xD;
  [3]: /c/portal/getImageAttachment?filename=image2.png&amp;amp;userId=21124&#xD;
  [4]: /c/portal/getImageAttachment?filename=image3.png&amp;amp;userId=21124&#xD;
  [5]: /c/portal/getImageAttachment?filename=image4.png&amp;amp;userId=21124&#xD;
  [6]: http://wolfr.am/299nDmYx</description>
    <dc:creator>Adriana O&amp;#039;Brien</dc:creator>
    <dc:date>2014-12-15T17:38:03Z</dc:date>
  </item>
  <item rdf:about="https://community.wolfram.com/groups/-/m/t/453169">
    <title>Wolfram Data Drop and the Raspberry Pi for Education</title>
    <link>https://community.wolfram.com/groups/-/m/t/453169</link>
    <description>Yesterday, [Stephen Wolfram announced][1] the release of this great solution for storing and sharing data coming from sensors, devices, programs, humans or anything else: the [Wolfram Data Drop][2]. I think this is a turning point; it completely changes the game on how I&amp;#039;ve been interacting with streams of data. In this post I want to share with you three ideas that I&amp;#039;ve been exploring using the [Raspberry Pi 2][3], which, by the way, it runs Mathematica about 10x faster than its predecessors!&#xD;
&#xD;
![Data Drop on the Raspberry Pi][4]&#xD;
![enter image description here][5]&#xD;
![enter image description here][6]&#xD;
&#xD;
The first idea that came to my mind was to revisit some of the experiments I had carried out in the past, like this [home alarm system][7]. In a matter of minutes, I was able to set up an activity tracker for my home&amp;#039;s hall. Every time I pass by, the [PIR motion sensor][8] adds a 1 to the &amp;#034;mov&amp;#034; variable that is being dropped to [this databin][9] every 20 minutes. Check it out in [W|A][10], it&amp;#039;s live and growing! [== Data drop 3v1UbpOM][11]&#xD;
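&#xD;
The logging loop amounts to something like this sketch (the GPIO pin number and polling period are illustrative; the real setup may differ):&#xD;
&#xD;
    (* hypothetical: poll a PIR sensor on GPIO 4 once a second,&#xD;
       then drop the accumulated count every 20 minutes *)&#xD;
    bin = Databin[&amp;#034;3v1UbpOM&amp;#034;];&#xD;
    mov = 0;&#xD;
    RunScheduledTask[If[DeviceRead[&amp;#034;GPIO&amp;#034;, 4] === {4 -&amp;gt; 1}, mov++], 1];&#xD;
    RunScheduledTask[DatabinAdd[bin, &amp;lt;|&amp;#034;mov&amp;#034; -&amp;gt; mov|&amp;gt;]; mov = 0, 20*60];&#xD;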
&#xD;
![My home hall&amp;#039;s activity][12]&#xD;
![Periodic entries][13]&#xD;
&#xD;
This is such a great thing; that dataset is just about me, but it could be monitoring whatever you want, like your cat&amp;#039;s crazy habits. For this example the data is being logged periodically, but you could set it up in an event-based manner. Like here, whenever a movement is detected, it triggers the [RaspiCam][15] and sends the snapshot to the following databin:&#xD;
&#xD;
![RaspiCam Databin][16]&#xD;
![Cumulative Activity Plot][17]&#xD;
&#xD;
What about making a several-day-long time-lapse of yourself?&#xD;
&#xD;
![Daily work at home][18]&#xD;
&#xD;
Or what about using other sensors? The possibilities are just endless!&#xD;
&#xD;
Finally, let me end with the following collaborative activity for the classroom. Here is how you can carry it out. &#xD;
&#xD;
First, you create a public databin to collect the two animal names each student will enter:&#xD;
&#xD;
    CreateDatabin[ &amp;#034;Interpretation&amp;#034; -&amp;gt; {&amp;#034;animal1&amp;#034; -&amp;gt; &amp;#034;Animal&amp;#034;, &amp;#034;animal2&amp;#034; -&amp;gt; &amp;#034;Animal&amp;#034;}, &amp;lt;|&amp;#034;Name&amp;#034; -&amp;gt; &amp;#034;Classroom Zoo&amp;#034;|&amp;gt;]&#xD;
&#xD;
Then, ask your students to submit their favorite animals&amp;#039; names, using the web-based platform http://wolfr.am/3zCzVgPJ&#xD;
&#xD;
![Add new entry][19]&#xD;
&#xD;
Their individual entries will end up generating things similar to this amazing [Graph][20]!&#xD;
&#xD;
    data = Values[Databin[&amp;#034;3zCzVgPJ&amp;#034;]];&#xD;
    pairs = Apply[Rule, Drop[Transpose[{data[&amp;#034;animal1&amp;#034;], data[&amp;#034;animal2&amp;#034;]}], 9], {1}]&#xD;
![Name pairs][21]&#xD;
&#xD;
    pics = Map[# -&amp;gt; #[&amp;#034;Image&amp;#034;] &amp;amp;, Union[Flatten[Drop[Transpose[{data[&amp;#034;animal1&amp;#034;], data[&amp;#034;animal2&amp;#034;]}], 9]]]];&#xD;
    style ={ VertexSize-&amp;gt;1.2,EdgeStyle-&amp;gt;Directive[Arrowheads[{{.02,.6}}],Hue[.4,1,.3]],VertexShape-&amp;gt;pics};&#xD;
    Graph[pairs, style, ImageSize -&amp;gt; 900]&#xD;
![Animals Graph][22]&#xD;
&#xD;
Please, give [it a try][23]. Later, we will see what the giant graph ends up looking like. Or even more fun, share with us your ideas or databins that you want to be filled out collaboratively!&#xD;
&#xD;
&#xD;
  [1]: http://blog.wolfram.com/2015/03/04/the-wolfram-data-drop-is-live/&#xD;
  [2]: https://datadrop.wolframcloud.com/&#xD;
  [3]: http://www.wolfram.com/raspberry-pi/&#xD;
  [4]: /c/portal/getImageAttachment?filename=RaspberryPi_Data_Drop.png&amp;amp;userId=56204&#xD;
  [5]: /c/portal/getImageAttachment?filename=raspberry-pi-01.png&amp;amp;userId=56204&#xD;
  [6]: /c/portal/getImageAttachment?filename=raspberry-pi-02.png&amp;amp;userId=56204&#xD;
  [7]: http://community.wolfram.com/groups/-/m/t/226163&#xD;
  [8]: http://www.adafruit.com/products/189&#xD;
  [9]: http://wolfr.am/3v1UbpOM&#xD;
  [10]: https://www.wolframalpha.com/input/?i=Data%20drop%203v1UbpOM&#xD;
  [11]: https://www.wolframalpha.com/input/?i=Data%20drop%203v1UbpOM&#xD;
  [12]: /c/portal/getImageAttachment?filename=activity.png&amp;amp;userId=56204&#xD;
  [13]: /c/portal/getImageAttachment?filename=entries.png&amp;amp;userId=56204&#xD;
  [14]: /c/portal/getImageAttachment?filename=MyHomeHallActivity.png&amp;amp;userId=56204&#xD;
  [15]: http://community.wolfram.com/groups/-/m/t/157704&#xD;
  [16]: /c/portal/getImageAttachment?filename=HomeActivity.jpg&amp;amp;userId=56204&#xD;
  [17]: /c/portal/getImageAttachment?filename=cumulativePlot.png&amp;amp;userId=56204&#xD;
  [18]: /c/portal/getImageAttachment?filename=timelapse_Bernat.gif&amp;amp;userId=56204&#xD;
  [19]: /c/portal/getImageAttachment?filename=inputDD.png&amp;amp;userId=56204&#xD;
  [20]: http://reference.wolfram.com/language/ref/Graph.html&#xD;
  [21]: /c/portal/getImageAttachment?filename=ClassZoo.jpg&amp;amp;userId=56204&#xD;
  [22]: /c/portal/getImageAttachment?filename=AnimalsGraph.jpg&amp;amp;userId=56204&#xD;
  [23]: http://wolfr.am/3zCzVgPJ</description>
    <dc:creator>Bernat Espigulé</dc:creator>
    <dc:date>2015-03-05T15:01:41Z</dc:date>
  </item>
</rdf:RDF>

