<?xml version="1.0" encoding="UTF-8"?>
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns="http://purl.org/rss/1.0/" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel rdf:about="https://community.wolfram.com">
    <title>Community RSS Feed</title>
    <link>https://community.wolfram.com</link>
    <description>RSS Feed for Wolfram Community showing ideas tagged with Augmented and Virtual Realities sorted by active.</description>
    <items>
      <rdf:Seq>
        <rdf:li rdf:resource="https://community.wolfram.com/groups/-/m/t/3503134" />
        <rdf:li rdf:resource="https://community.wolfram.com/groups/-/m/t/2292400" />
        <rdf:li rdf:resource="https://community.wolfram.com/groups/-/m/t/2166965" />
        <rdf:li rdf:resource="https://community.wolfram.com/groups/-/m/t/2143320" />
        <rdf:li rdf:resource="https://community.wolfram.com/groups/-/m/t/2097827" />
        <rdf:li rdf:resource="https://community.wolfram.com/groups/-/m/t/1810945" />
        <rdf:li rdf:resource="https://community.wolfram.com/groups/-/m/t/1801300" />
        <rdf:li rdf:resource="https://community.wolfram.com/groups/-/m/t/1383237" />
        <rdf:li rdf:resource="https://community.wolfram.com/groups/-/m/t/1379131" />
        <rdf:li rdf:resource="https://community.wolfram.com/groups/-/m/t/1369571" />
        <rdf:li rdf:resource="https://community.wolfram.com/groups/-/m/t/1305281" />
        <rdf:li rdf:resource="https://community.wolfram.com/groups/-/m/t/879610" />
        <rdf:li rdf:resource="https://community.wolfram.com/groups/-/m/t/772917" />
        <rdf:li rdf:resource="https://community.wolfram.com/groups/-/m/t/908742" />
        <rdf:li rdf:resource="https://community.wolfram.com/groups/-/m/t/814988" />
        <rdf:li rdf:resource="https://community.wolfram.com/groups/-/m/t/230347" />
        <rdf:li rdf:resource="https://community.wolfram.com/groups/-/m/t/96837" />
        <rdf:li rdf:resource="https://community.wolfram.com/groups/-/m/t/90931" />
        <rdf:li rdf:resource="https://community.wolfram.com/groups/-/m/t/35026" />
      </rdf:Seq>
    </items>
  </channel>
  <item rdf:about="https://community.wolfram.com/groups/-/m/t/3503134">
    <title>[WSRP25] Helping hand to Africa:prosthetic hand&amp;#039;s kinetic movement &amp;amp; force-closure grasping analysis</title>
    <link>https://community.wolfram.com/groups/-/m/t/3503134</link>
    <description>![Helping hand to Africa:prosthetic hand&amp;#039;s kinetic movement &amp;amp; force-closure grasping analysis][1]&#xD;
&#xD;
&#xD;
&amp;amp;[Wolfram Notebook][2]&#xD;
&#xD;
&#xD;
  [1]: https://community.wolfram.com//c/portal/getImageAttachment?filename=Screenshot2025-07-10at4.09.36%E2%80%AFPM.png&amp;amp;userId=3501263&#xD;
  [2]: https://www.wolframcloud.com/obj/b1e5f73f-31a2-405b-96fa-122564a24619</description>
    <dc:creator>Eunchan Hwang</dc:creator>
    <dc:date>2025-07-10T22:07:54Z</dc:date>
  </item>
  <item rdf:about="https://community.wolfram.com/groups/-/m/t/2292400">
    <title>UnityLink: gotcha&amp;#039;s for beginners. 17.6.21</title>
    <link>https://community.wolfram.com/groups/-/m/t/2292400</link>
    <description>TL;DR: Use Unity 2019.4.29f1 with Wolfram 12.3 UnityLink&#xD;
&#xD;
Hello all.&#xD;
&#xD;
This will probably turn out to be a mistake, but I thought I&amp;#039;d post whatever nuggets I discover trying to get Wolfram and Unity to play nice.&#xD;
&#xD;
So far, here&amp;#039;s what I know.&#xD;
&#xD;
I&amp;#039;m on Mathematica 12.3. I&amp;#039;ve been struggling with this for over a week now, and I could not get UnityLink to work in a stable fashion until I rolled back to Unity 2019.4.28 f1. Now that I&amp;#039;ve done that, I&amp;#039;m back to having fun in Wolfram with Unity.&#xD;
&#xD;
I&amp;#039;m enclosing the &amp;#034;Spikey&amp;#034; notebook for you to try yourselves; if you get it to evaluate all the way to the end, you&amp;#039;re in a good place with your setup, AFAIK.&#xD;
&amp;amp;[Wolfram Notebook][1]&#xD;
&#xD;
&#xD;
  [1]: https://www.wolframcloud.com/obj/3180afd1-447d-4abe-92f5-28dc604c4f4e</description>
    <dc:creator>duncan shepherd</dc:creator>
    <dc:date>2021-06-17T20:27:06Z</dc:date>
  </item>
  <item rdf:about="https://community.wolfram.com/groups/-/m/t/2166965">
    <title>3D AR Graphing with Wolfram</title>
    <link>https://community.wolfram.com/groups/-/m/t/2166965</link>
    <description>Hi everyone! I made a website that uses Wolfram&amp;#039;s client library for Python to make 3D models of an input equation and view them in AR! Check it out here:  &#xD;
https://devpost.com/software/3d-ar-grapher  &#xD;
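
The core idea can be sketched independently of the site&amp;#039;s own code: sample the equation on a grid and emit a Wavefront OBJ mesh, which AR viewers can display after conversion to a format like USDZ or glTF. A minimal, hypothetical Python sketch (the helper name, grid size and span are my own, not from the project):

```python
import io

def equation_to_obj(f, n=20, span=2.0):
    """Sample z = f(x, y) on an n-by-n grid and emit a Wavefront OBJ mesh."""
    buf = io.StringIO()
    step = 2 * span / (n - 1)
    # one "v x y z" line per grid point
    for j in range(n):
        for i in range(n):
            x = -span + i * step
            y = -span + j * step
            buf.write("v {:.4f} {:.4f} {:.4f}\n".format(x, y, f(x, y)))
    # two triangles per grid cell; OBJ face indices are 1-based
    for j in range(n - 1):
        for i in range(n - 1):
            a = j * n + i + 1
            b, c, d = a + 1, a + n, a + n + 1
            buf.write("f {} {} {}\n".format(a, b, d))
            buf.write("f {} {} {}\n".format(a, d, c))
    return buf.getvalue()

obj = equation_to_obj(lambda x, y: x * x - y * y)
```
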
Does anyone have any suggestions or improvements I could make?</description>
    <dc:creator>Richard Sbaschnig</dc:creator>
    <dc:date>2021-01-18T22:10:17Z</dc:date>
  </item>
  <item rdf:about="https://community.wolfram.com/groups/-/m/t/2143320">
    <title>Traveling around Jupiter and Saturn</title>
    <link>https://community.wolfram.com/groups/-/m/t/2143320</link>
    <description>![Figure 2][5]&#xD;
![Figure 3][6]&#xD;
&#xD;
&#xD;
Since we have the historic conjunction of Jupiter and Saturn, I have prepared code that visualizes the planets Jupiter and Saturn with their moons. The first piece of code was also published [here][3]. Visualization of the planet Jupiter:&#xD;
&#xD;
    JSP = QuantityMagnitude[&#xD;
       UnitConvert[&#xD;
         PlanetaryMoonData[{&amp;#034;Io&amp;#034;, &amp;#034;Europa&amp;#034;, &amp;#034;Ganymede&amp;#034;, &#xD;
           &amp;#034;Callisto&amp;#034;}, {&amp;#034;OrbitPeriod&amp;#034;, &amp;#034;SemimajorAxis&amp;#034;, &amp;#034;Radius&amp;#034;}], &#xD;
         &amp;#034;SI&amp;#034;] /. &#xD;
        q : Quantity[_, &amp;#034;Days&amp;#034; | &amp;#034;Hours&amp;#034;] :&amp;gt; UnitConvert[q, &amp;#034;Seconds&amp;#034;]];&#xD;
    a = Pi*RandomReal[{0, 2}, Length[JSP]];&#xD;
    b = 31557600*&#xD;
       QuantityMagnitude[Entity[&amp;#034;Planet&amp;#034;, &amp;#034;Jupiter&amp;#034;][&amp;#034;OrbitPeriod&amp;#034;]]*&#xD;
       Table[1/JSP[[i, 1]], {i, 1, 4}];&#xD;
    R = Table[JSP[[i, 2]], {i, 1, 4}];&#xD;
    c = Table[JSP[[i, 3]], {i, 1, 4}];&#xD;
    k = 3;&#xD;
    radius = QuantityMagnitude[&#xD;
       Entity[&amp;#034;Planet&amp;#034;, &amp;#034;Jupiter&amp;#034;][&amp;#034;EquatorialRadius&amp;#034;], &amp;#034;Kilometers&amp;#034;];&#xD;
    radius1 = &#xD;
      QuantityMagnitude[Entity[&amp;#034;Planet&amp;#034;, &amp;#034;Jupiter&amp;#034;][&amp;#034;PolarRadius&amp;#034;], &#xD;
       &amp;#034;Kilometers&amp;#034;];&#xD;
    Xoblateness = Entity[&amp;#034;Planet&amp;#034;, &amp;#034;Jupiter&amp;#034;][&amp;#034;Oblateness&amp;#034;];&#xD;
    Xobliquity = Entity[&amp;#034;Planet&amp;#034;, &amp;#034;Jupiter&amp;#034;][&amp;#034;Obliquity&amp;#034;];&#xD;
    distanse = &#xD;
      QuantityMagnitude[&#xD;
       Entity[&amp;#034;Planet&amp;#034;, &amp;#034;Jupiter&amp;#034;][&amp;#034;AverageOrbitDistance&amp;#034;], &amp;#034;Kilometers&amp;#034;];&#xD;
    angularvelocity = &#xD;
      Entity[&amp;#034;Planet&amp;#034;, &amp;#034;Jupiter&amp;#034;][&amp;#034;EquatorialAngularVelocity&amp;#034;];&#xD;
    texture = &#xD;
      ImageReflect[&#xD;
       EntityValue[Entity[&amp;#034;Planet&amp;#034;, &amp;#034;Jupiter&amp;#034;], &#xD;
        &amp;#034;CylindricalEquidistantTexture&amp;#034;], Bottom];&#xD;
    &#xD;
    planet = ParametricPlot3D[{radius Cos[t] Sin[p], &#xD;
       radius Sin[t] Sin[p], (1 - Xoblateness) radius Cos[p]}, {t, 0, &#xD;
       2 Pi}, {p, 0, \[Pi]}, Mesh -&amp;gt; None, PlotStyle -&amp;gt; Texture[texture], &#xD;
      Lighting -&amp;gt; &amp;#034;Neutral&amp;#034;, Boxed -&amp;gt; False, Axes -&amp;gt; False, &#xD;
      PlotPoints -&amp;gt; 100]&#xD;
Visualization of the traveling orbit (we solve the restricted three-body problem):&#xD;
&#xD;
    M1 = QuantityMagnitude[Entity[&amp;#034;Planet&amp;#034;, &amp;#034;Jupiter&amp;#034;][&amp;#034;Mass&amp;#034;]];&#xD;
    M2 = QuantityMagnitude[Entity[&amp;#034;Star&amp;#034;, &amp;#034;Sun&amp;#034;][&amp;#034;Mass&amp;#034;]];&#xD;
    Ra = QuantityMagnitude[&#xD;
        Entity[&amp;#034;Planet&amp;#034;, &amp;#034;Jupiter&amp;#034;][&amp;#034;EquatorialRadius&amp;#034;]]*10^3;&#xD;
    S = QuantityMagnitude[Entity[&amp;#034;Planet&amp;#034;, &amp;#034;Jupiter&amp;#034;][&amp;#034;SemimajorAxis&amp;#034;]];&#xD;
    ar = Ra/(S*149597870700);&#xD;
    m = M1/M2;&#xD;
    G = 6.67384*10^(-11); vS = &#xD;
     Sqrt[G*M1/Ra]; u0 = 16000; {ux, uy, &#xD;
      uz} = {0.28309789428056364`, -2.269693660804425`, &#xD;
      0.24765897137821516`}; tm = 2*0.007188414157797473;&#xD;
    {x0, y0, z0} = {0.9987450742768228`, -0.00006988474848081634`, \&#xD;
    -0.000015444353758731164`};&#xD;
    eq = {x&amp;#039;&amp;#039;[t] == &#xD;
        2*y&amp;#039;[t] + x[t] - (&#xD;
         m (-1 + m + x[t]))/((-1 + m + x[t])^2 + y[t]^2 + z[t]^2)^(&#xD;
         3/2) - ((1 - m) (m + x[t]))/((m + x[t])^2 + y[t]^2 + z[t]^2)^(&#xD;
         3/2), y&amp;#039;&amp;#039;[t] == -2*x&amp;#039;[t] + y[t] - (&#xD;
         m y[t])/((-1 + m + x[t])^2 + y[t]^2 + z[t]^2)^(&#xD;
         3/2) - ((1 - m) y[t])/((m + x[t])^2 + y[t]^2 + z[t]^2)^(3/2), &#xD;
       z&amp;#039;&amp;#039;[t] == -((m z[t])/((-1 + m + x[t])^2 + y[t]^2 + z[t]^2)^(&#xD;
          3/2)) - ((1 - m) z[t])/((m + x[t])^2 + y[t]^2 + z[t]^2)^(3/2), &#xD;
       x[0.] == x0, y[0.] == y0, x&amp;#039;[0.] == ux, y&amp;#039;[0.] == uy, z&amp;#039;[0.] == uz,&#xD;
        z[0.] == z0};&#xD;
    {xfun, yfun, zfun} = NDSolveValue[eq, {x, y, z}, {t, 0, tm}];&#xD;
    rt = RotationTransform[Xobliquity, {1, 0, 0}]; V = &#xD;
     rt[S*149597870.700*{xfun[t] - 1 + m, yfun[t], zfun[t]}];&#xD;
    &#xD;
    {Orbit = ParametricPlot3D[V, {t, 0, tm}], Plot[Norm[V], {t, 0, tm}]}&#xD;
    &#xD;
    Show[{Graphics3D[Orbit[[1]], Boxed -&amp;gt; False], planet}]&#xD;
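
As a cross-check, the same rotating-frame system can be integrated outside Mathematica with a plain fixed-step RK4. A minimal Python sketch (masses rounded; function and variable names are mine, not from the notebook), using the Jacobi constant, which is conserved along restricted three-body trajectories, as an accuracy check:

```python
import math

# Approximate masses; only the ratio m matters in the equations.
M_JUP = 1.898e27
M_SUN = 1.989e30
m = M_JUP / M_SUN

def deriv(s):
    # Right-hand sides of the rotating-frame restricted three-body
    # equations, matching the system passed to NDSolveValue above.
    x, y, z, vx, vy, vz = s
    r1 = ((m + x)**2 + y*y + z*z) ** 1.5      # (distance to Sun)^3
    r2 = ((x - 1 + m)**2 + y*y + z*z) ** 1.5  # (distance to Jupiter)^3
    ax = 2*vy + x - m*(x - 1 + m)/r2 - (1 - m)*(m + x)/r1
    ay = -2*vx + y - m*y/r2 - (1 - m)*y/r1
    az = -m*z/r2 - (1 - m)*z/r1
    return [vx, vy, vz, ax, ay, az]

def rk4(s, dt, steps):
    # Plain fixed-step fourth-order Runge-Kutta integrator.
    for _ in range(steps):
        k1 = deriv(s)
        k2 = deriv([s[i] + 0.5*dt*k1[i] for i in range(6)])
        k3 = deriv([s[i] + 0.5*dt*k2[i] for i in range(6)])
        k4 = deriv([s[i] + dt*k3[i] for i in range(6)])
        s = [s[i] + dt*(k1[i] + 2*k2[i] + 2*k3[i] + k4[i])/6 for i in range(6)]
    return s

def jacobi(s):
    # Jacobi constant: conserved along the trajectory, so its drift
    # measures the integration error.
    x, y, z, vx, vy, vz = s
    r1 = math.sqrt((m + x)**2 + y*y + z*z)
    r2 = math.sqrt((x - 1 + m)**2 + y*y + z*z)
    return x*x + y*y + 2*(1 - m)/r1 + 2*m/r2 - (vx*vx + vy*vy + vz*vz)

# Initial state {x0, y0, z0, ux, uy, uz} and time span tm from the code above.
s0 = [0.9987450742768228, -0.00006988474848081634, -0.000015444353758731164,
      0.28309789428056364, -2.269693660804425, 0.24765897137821516]
tm = 2 * 0.007188414157797473
s_end = rk4(s0, tm / 20000, 20000)
```
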
We get this picture of Jupiter with the orbit around it:&#xD;
![Figure 1][4]&#xD;
&#xD;
Now we combine the orbit, planet and moons in one scene, and export it as a GIF (or animate it).&#xD;
&#xD;
    Th = tm*31557600*&#xD;
       QuantityMagnitude[&#xD;
          Entity[&amp;#034;Planet&amp;#034;, &amp;#034;Jupiter&amp;#034;][&amp;#034;OrbitPeriod&amp;#034;]]/(3600)/(2*Pi);&#xD;
    &#xD;
    frames = Table[&#xD;
       Show[{Graphics3D[&#xD;
          Rotate[planet[[1]], 2*Pi*Th*t/9.841666666666667/tm, {0, 0, 1}], &#xD;
          Boxed -&amp;gt; False, Axes -&amp;gt; False, ImageSize -&amp;gt; .25 {1920, 1080}], &#xD;
         Table[Graphics3D[{White, &#xD;
            Sphere[{R[[i]]*Cos[a[[i]] + b[[i]]*t], &#xD;
              R[[i]]*Sin[a[[i]] + b[[i]]*t], 0}, k*c[[i]]]}, &#xD;
           Background -&amp;gt; Black, Boxed -&amp;gt; False, PlotRange -&amp;gt; All], {i, 1, &#xD;
           4}]}, Background -&amp;gt; Black, SphericalRegion -&amp;gt; True, &#xD;
        ViewAngle -&amp;gt; Pi/4, PlotRange -&amp;gt; All, &#xD;
        Lighting -&amp;gt; {{&amp;#034;Ambient&amp;#034;, GrayLevel[0.05]}, {&amp;#034;Point&amp;#034;, &#xD;
           White, {20 radius, 0, 0}}}, ViewVector -&amp;gt; {V, {0, 0, 0}}], {t, &#xD;
        0, (1 - .0025)*tm, .0025*tm}];&#xD;
    Export[&amp;#034;C:\\...\\Jupiter.gif&amp;#034;, frames, AnimationRepetitions -&amp;gt; Infinity]&#xD;
&#xD;
![Figure 2][5]&#xD;
&#xD;
The second piece of code, visualizing Saturn and the moons &amp;#034;Mimas&amp;#034;, &amp;#034;Enceladus&amp;#034;, &amp;#034;Tethys&amp;#034;, &amp;#034;Dione&amp;#034;, &amp;#034;Rhea&amp;#034; and &amp;#034;Titan&amp;#034;, differs from the above, since we need to show the rings, and the orbit around Saturn is not like the orbit around Jupiter. The code is in the attached notebook, and the animation looks like this:&#xD;
 &#xD;
![Figure 3][6]&#xD;
&#xD;
&#xD;
  [1]: https://community.wolfram.com//c/portal/getImageAttachment?filename=ezgif-7-e64479f927ce.gif&amp;amp;userId=20103&#xD;
  [2]: https://community.wolfram.com//c/portal/getImageAttachment?filename=ezgif-7-b5d492aca470.gif&amp;amp;userId=20103&#xD;
  [3]: https://mathematica.stackexchange.com/questions/236843/travelling-around-jupiter?noredirect=1#comment598213_236843&#xD;
  [4]: https://community.wolfram.com//c/portal/getImageAttachment?filename=JupiterOrbit.jpg&amp;amp;userId=1218692&#xD;
  [5]: https://community.wolfram.com//c/portal/getImageAttachment?filename=JupiterMF4.gif&amp;amp;userId=1218692&#xD;
  [6]: https://community.wolfram.com//c/portal/getImageAttachment?filename=SaturnFM5.gif&amp;amp;userId=1218692</description>
    <dc:creator>Alexander Trounev</dc:creator>
    <dc:date>2020-12-20T22:28:26Z</dc:date>
  </item>
  <item rdf:about="https://community.wolfram.com/groups/-/m/t/2097827">
    <title>3d Compare past and future generations of the Wolfram models</title>
    <link>https://community.wolfram.com/groups/-/m/t/2097827</link>
    <description>**[View this post in Wolfram Cloud Notebook format][1]**&#xD;
&#xD;
Viewing events in 3d at the node level to compare past and future generations of Wolfram models&#xD;
========================================================================&#xD;
&#xD;
We added a new feature to the gigabrain.io physics 3d explorer: a comparison between the final states of the previous, current and next generations.&#xD;
&#xD;
&amp;gt;![Wolfram model 6721 progress][2]&amp;lt;br&amp;gt;&#xD;
**[Video animation of 6721 progress in 50 generations][3]**&amp;lt;br&amp;gt;&#xD;
[Click to see model 6721 generation 8 comparison in 3d][4]&#xD;
&#xD;
&amp;gt;![Wolfram model 44586 progress][5]&amp;lt;br&amp;gt;&#xD;
**[Video animation of 44586 progress in 100 generations][6]**&amp;lt;br&amp;gt;&#xD;
[Click to see model 44586 generation 1 comparison in 3d][7]&#xD;
&#xD;
Color code explanations in the 3d viewer of the generation comparison&#xD;
---------------------------------------------------------------------&#xD;
&#xD;
The 3d viewer&amp;#039;s node colors are coded as follows:&#xD;
&#xD;
 - Red = Node is found only in the previous generation&#xD;
 - Light red = Node is found in the previous and current generations but not the next&#xD;
 - Magenta = Node is found in all three generations: previous, current and next&#xD;
 - Light green = Node is found in the current and next generations but not the previous&#xD;
 - Green = Node is found only in the next generation&#xD;
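
These color rules amount to simple set-membership tests on three generations of node ids. A hypothetical Python sketch (the viewer itself is JavaScript; `classify_nodes` and its arguments are illustrative only):

```python
def classify_nodes(prev, curr, nxt):
    """Assign the viewer's color classes from three sets of node ids."""
    colors = {}
    for node in prev | curr | nxt:
        in_p, in_c, in_n = node in prev, node in curr, node in nxt
        if in_p and in_c and in_n:
            colors[node] = "magenta"
        elif in_p and in_c:
            colors[node] = "light red"
        elif in_c and in_n:
            colors[node] = "light green"
        elif in_p:
            colors[node] = "red"
        elif in_n:
            colors[node] = "green"
        else:
            colors[node] = "current only"  # not covered by the legend above
    return colors
```

For example, with the generation 2/3/4 node sets described below, node 4 comes out "light red" and nodes 8 through 13 come out "green".
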
&#xD;
Wolfram model final states as image list&#xD;
----------------------------------------&#xD;
&#xD;
Below is a comparison of the event states in the Wolfram SetReplace project.&#xD;
Here we use the Wolfram model wm3147.&#xD;
&#xD;
Let&amp;#039;s compare the images of the final states using the StatesPlotsList property.&#xD;
&#xD;
    ResourceFunction[&amp;#034;WolframModel&amp;#034;][{{{1, 1, 2}, {3, 4, 5}} -&amp;gt; {{3, 3, 6}, {1, 3, 2}, {7, 1, 3}, {7, 8, 2}}}, {{1, 1, 1}, {1, 1, 1}}, 4 ][ &amp;#034;StatesPlotsList&amp;#034;,VertexLabels -&amp;gt; Automatic,  ImageSize -&amp;gt; 100]&#xD;
![Generations output][8]&#xD;
&#xD;
Comparing generations in 3d viewer&#xD;
----------------------------------&#xD;
&#xD;
**[Click to See model 3147 generation 3 comparison in 3d viewer][9]**&#xD;
&#xD;
To compare generation 3 in the 3d viewer, we can use the link above, which gives the image below. In the user interface you can change the mode by clicking the mode button: the default mode shows a single graph, the next mode is the generation comparison, and the third mode downloads the graph as an OBJ 3d file.&#xD;
![3d Generation comparison][10]&#xD;
&#xD;
In this graph the current generation is 3, the previous is 2 and the next is 4. For reference, the generations are shown below.&#xD;
Magenta nodes 1, 2 and 3 are found in generations 2, 3 and 4.&#xD;
Light red node 4 is not in generation 4 but can be found in 3 and 2.&#xD;
Light green nodes 5, 6 and 7 are created in the current generation and also present in the next generation.&#xD;
Green nodes 8, 9, 10, 11, 12 and 13 are found only in generation 4.&#xD;
&#xD;
![Generations 2 3 4][11]&#xD;
&#xD;
**[Click to See model 3147 generation 4 comparison in 3d viewer][12]**&#xD;
&#xD;
When changing the view to the next generation, 4, we get the image below.&#xD;
![3d Generation comparison 2][13]&#xD;
&#xD;
In this graph the current generation is 4, the previous is 3 and the next is 5. For reference, the generations are shown below.&#xD;
Magenta nodes 1, 2, 3, 5 and 6 are found in generations 3, 4 and 5.&#xD;
Light red node 7 is not in generation 5 but can be found in 4 and 3.&#xD;
Light green nodes 8, 9, 10, 11, 12 and 13 are created in the current generation 4 and also present in the next generation 5.&#xD;
Green nodes 14, 15, 16, 17, 18 and 19 are found only in the next generation 5.&#xD;
&#xD;
![Generations 3 4 5][14]&#xD;
&#xD;
The final-generation comparison does not take into account the individual events that produce the final states. We can see a full comparison of the events that produced the states with the EventsStatesPlotsList property.&#xD;
&#xD;
The documentation says about these graphs: the dotted gray edges are the ones about to be deleted, whereas the red ones have just been created.&#xD;
&#xD;
    ResourceFunction[&amp;#034;WolframModel&amp;#034;][{{{1, 1, 2}, {3, 4, 5}} -&amp;gt; {{3, 3, 6}, {1, 3, 2}, {7, 1, 3}, {7, 8, 2}}}, {{1, 1, 1}, {1, 1, 1}}, 4 ][ &amp;#034;EventsStatesPlotsList&amp;#034;,VertexLabels -&amp;gt; Automatic,  ImageSize -&amp;gt; 100]&#xD;
![Event states][15]&#xD;
&#xD;
Events to produce the generation 3. Nodes with red lines are created.&#xD;
&#xD;
![Event State 1][16]&#xD;
&#xD;
Events to produce the generation 4. Here we see the removal of the node 4 with two dotted gray lines.&#xD;
&#xD;
![Event State 2][17]&#xD;
&#xD;
Events to produce the generation 5. Here we see the removal of the node 7 with two dotted gray lines.&#xD;
&#xD;
![Event State 3][18]&#xD;
&#xD;
Conclusion&#xD;
==========&#xD;
&#xD;
This shows that comparing the final generations can visually reveal in which areas new nodes are created and in which areas nodes are deleted. It does not, however, show the full event graph of how the events are produced.&#xD;
&#xD;
Problems&#xD;
========&#xD;
&#xD;
 - The viewer does not distinguish events that first removed a node and later added it again within the same generation. It only compares the differences between the final generations.&#xD;
 - Currently the 3d viewer does not handle all types of Wolfram models, such as single-element models (wm161) or models with multiple rules (wm4486).&#xD;
 - The model calculation is done in JavaScript and does not use the Wolfram Language. Because of this, rule choosing is not available in the 3d viewer, and the resulting event numbering might differ.&#xD;
 - We use the name Wolfram model as defined in the SetReplace project: &amp;#034;A more interesting case (which we call a Wolfram model) is one where the set elements are related to each other. Specifically, we can consider a set of ordered lists of atomic vertices; in other words, an ordered hypergraph.&amp;#034; The implementation, however, is currently our own JavaScript implementation and does not use the SetReplace package.&#xD;
&#xD;
References&#xD;
========&#xD;
&#xD;
 - Wolfram Model Explorer: Zoli Kahan, [https://community.wolfram.com/groups/-/m/t/1985729][19] [https://github.com/Zolmeister/wolfram-model-explorer][20]&#xD;
 - SetReplace: Max Piskunov, [https://github.com/maxitg/SetReplace/][21] [PlotsOfEvents][22]&#xD;
 - Registry of Notable Universe Models: [https://www.wolframphysics.org/universes/wm3147/][23]&#xD;
&#xD;
&#xD;
  [1]: https://www.wolframcloud.com/obj/tuomas0/Published/Compare_Generations_in_3d_wm3147.nb&#xD;
  [2]: https://community.wolfram.com//c/portal/getImageAttachment?filename=wm6721Animation50generations.gif&amp;amp;userId=2053745&#xD;
  [3]: https://gigabrain.s3-eu-west-1.amazonaws.com/physics/wm6721+Animation+50+generations.mp4&#xD;
  [4]: http://gigabrain.io/physics/index.html?rule=%7B%7B%7B1,1,2%7D,%7B3,2,4%7D%7D-%3E%7B%7B5,5,3%7D,%7B3,1,5%7D,%7B4,2,4%7D%7D%7D&amp;amp;init=%7B%7B1,1,1%7D,%7B1,1,1%7D%7D&amp;amp;steps=8&amp;amp;modelId=556&amp;amp;modeType=1&#xD;
  [5]: https://community.wolfram.com//c/portal/getImageAttachment?filename=44586generation100animation.gif&amp;amp;userId=2053745&#xD;
  [6]: https://gigabrain.s3-eu-west-1.amazonaws.com/physics/wm44586+Animation+100+generations.mp4&#xD;
  [7]: http://gigabrain.io/physics/index.html?rule=%7B%7B%7B1,2,3%7D,%7B4,2,5%7D%7D-%3E%7B%7B6,3,1%7D,%7B3,6,4%7D,%7B1,2,6%7D%7D%7D&amp;amp;init=%7B%7B1,1,1%7D,%7B1,1,1%7D%7D&amp;amp;steps=1&amp;amp;modelId=335&amp;amp;modeType=1&#xD;
  [8]: https://community.wolfram.com//c/portal/getImageAttachment?filename=Generationsout1.png&amp;amp;userId=2053745&#xD;
  [9]: http://gigabrain.io/physics/index.html?rule=%7B%7B%7B1,1,2%7D,%7B3,4,5%7D%7D-%3E%7B%7B3,3,6%7D,%7B1,3,2%7D,%7B7,1,3%7D,%7B7,8,2%7D%7D%7D&amp;amp;init=%7B%7B1,1,1%7D,%7B1,1,1%7D%7D&amp;amp;steps=3&amp;amp;modelId=193&amp;amp;modeType=1&#xD;
  [10]: https://community.wolfram.com//c/portal/getImageAttachment?filename=gigabrainGeneration3comparison.png&amp;amp;userId=2053745&#xD;
  [11]: https://community.wolfram.com//c/portal/getImageAttachment?filename=Generations1.png&amp;amp;userId=2053745&#xD;
  [12]: http://gigabrain.io/physics/index.html?rule=%7B%7B%7B1,1,2%7D,%7B3,4,5%7D%7D-%3E%7B%7B3,3,6%7D,%7B1,3,2%7D,%7B7,1,3%7D,%7B7,8,2%7D%7D%7D&amp;amp;init=%7B%7B1,1,1%7D,%7B1,1,1%7D%7D&amp;amp;steps=4&amp;amp;modelId=193&amp;amp;modeType=1&#xD;
  [13]: https://community.wolfram.com//c/portal/getImageAttachment?filename=gigabrainGeneration4comparison.png&amp;amp;userId=2053745&#xD;
  [14]: https://community.wolfram.com//c/portal/getImageAttachment?filename=Generations2.png&amp;amp;userId=2053745&#xD;
  [15]: https://community.wolfram.com//c/portal/getImageAttachment?filename=Eventsout1.png&amp;amp;userId=2053745&#xD;
  [16]: https://community.wolfram.com//c/portal/getImageAttachment?filename=Eventsout2.png&amp;amp;userId=2053745&#xD;
  [17]: https://community.wolfram.com//c/portal/getImageAttachment?filename=Eventsout3.png&amp;amp;userId=2053745&#xD;
  [18]: https://community.wolfram.com//c/portal/getImageAttachment?filename=Eventsout4.png&amp;amp;userId=2053745&#xD;
  [19]: https://community.wolfram.com/groups/-/m/t/1985729&#xD;
  [20]: https://github.com/Zolmeister/wolfram-model-explorer&#xD;
  [21]: https://github.com/maxitg/SetReplace/&#xD;
  [22]: https://github.com/maxitg/SetReplace/blob/master/Documentation/SymbolsAndFunctions/WolframModelAndWolframModelEvolutionObject/Properties/PlotsOfEvents.md &amp;#034;PlotsOfEvents&amp;#034;&#xD;
  [23]: https://www.wolframphysics.org/universes/wm3147/</description>
    <dc:creator>Tuomas Sorakivi</dc:creator>
    <dc:date>2020-10-19T08:07:04Z</dc:date>
  </item>
  <item rdf:about="https://community.wolfram.com/groups/-/m/t/1810945">
    <title>Photorealistic Rendering: A walkthrough for working with LuxCoreRender</title>
    <link>https://community.wolfram.com/groups/-/m/t/1810945</link>
    <description>[![Cover Image][1]](https://community.wolfram.com//c/portal/getImageAttachment?filename=diamond_denoised.png&amp;amp;userId=93201)&#xD;
&#xD;
A couple of days ago, in Wolfram&amp;#039;s WeChat group, there was a pleasant discussion about Tim&amp;#039;s neat post [**Ray-tracing Graphics3D with UnityLink**]( https://community.wolfram.com/groups/-/m/t/1801300). I then promised a walkthrough on how to work with [*LuxCoreRender*](http://www.luxcorerender.org/), and I think it would be nice to share my experience not only in the WeChat group but also here with the Community.&#xD;
&#xD;
LuxCoreRender is a cross-platform render engine published under the GPLv3 license. It provides nice features and both Python and C++ APIs. According to [the official site](https://luxcorerender.org/goals/), it&amp;#039;s a *physically correct, unbiased* rendering engine. LuxCoreRender can usually be accessed as a plugin in many popular 3D tools such as Blender, but in this post I&amp;#039;ll talk about a way more suitable for scripting from Mathematica&amp;#039;s side.&#xD;
&#xD;
First of all, let&amp;#039;s set our working directory.&#xD;
&#xD;
```&#xD;
SetDirectory[NotebookDirectory[]];&#xD;
```&#xD;
&#xD;
## Import and Export LuxCoreRender SDL files&#xD;
&#xD;
Readers can safely skip this section. We&amp;#039;ll just dump our import and export functions for the LuxCoreRender SDL files here. Please use them with caution. Though I have tested them with some examples, edge cases can bite.&#xD;
&#xD;
```&#xD;
ClearAll[branch]&#xD;
branch = Through@*{##} &amp;amp; ;&#xD;
```&#xD;
```&#xD;
ClearAll[luxcoreReaderInternalStep, luxcoreReader]&#xD;
&#xD;
luxcoreReaderInternalStep[e : {__List}] := e // RightComposition[&#xD;
     Map[If[Length[#] == 1, #[[1]], #] &amp;amp;],&#xD;
     GroupBy[ListQ],&#xD;
     If[KeyExistsQ[#, True], # // MapAt[GroupBy[First -&amp;gt; Rest] /* Map[luxcoreReaderInternalStep], Key[True]], #] &amp;amp;,&#xD;
     If[KeyExistsQ[#, False], # // MapAt[Apply[Sequence], Key[False]], #] &amp;amp;,&#xD;
     Values,&#xD;
     Switch[#,&#xD;
            {_Association}, #[[1]],&#xD;
            _, #&#xD;
            ] &amp;amp;&#xD;
     ]&#xD;
&#xD;
luxcoreReader =&#xD;
 Module[{numReader = ImportString[#, &amp;#034;Table&amp;#034;][[1, 1]] &amp;amp;},&#xD;
        RightComposition[&#xD;
              StringReplace[(StartOfLine ~~ {Whitespace, &amp;#034;#&amp;#034; ~~ Except[&amp;#034;\n&amp;#034;] ...} ... ~~ &amp;#034;\n&amp;#034;) .. :&amp;gt; &amp;#034;&amp;#034;],&#xD;
              StringTrim,&#xD;
              StringSplitNested[#, {&amp;#034;\n&amp;#034;, Whitespace ~~ &amp;#034;=&amp;#034; ~~ Whitespace}] &amp;amp;,&#xD;
              MapAt[StringTrim, {;; , ;;}],&#xD;
              MapAt[StringSplit[#, &amp;#034;.&amp;#034;] &amp;amp;, {;; , 1}],&#xD;
              Map[Flatten],&#xD;
              luxcoreReaderInternalStep,&#xD;
              Map[Last, #, {-2}] &amp;amp;,&#xD;
              Map[&#xD;
                  Switch[#,&#xD;
                         _?(StringMatchQ[&amp;#034;\&amp;#034;&amp;#034; ~~ ___ ~~ &amp;#034;\&amp;#034;&amp;#034;])                    , StringTake[#, {2, -2}],&#xD;
                         _?(Not@*StringFreeQ[Whitespace])                         , StringSplit[#, Whitespace] // Map[#0],&#xD;
                         _?(StringMatchQ[{DigitCharacter, &amp;#034;.&amp;#034;, &amp;#034;+&amp;#034;, &amp;#034;-&amp;#034;, &amp;#034;e&amp;#034;} ..]), numReader@#,&#xD;
                         _, #&#xD;
                         ] &amp;amp;&#xD;
                  , #, {-1}] &amp;amp;&#xD;
              ]&#xD;
       ];&#xD;
```&#xD;
```&#xD;
ClearAll[luxcoreWriter]&#xD;
&#xD;
luxcoreWriter =&#xD;
 Module[{&#xD;
         keychainCombiner = Cases[Key[key_] :&amp;gt; key] /* (Riffle[#, &amp;#034;.&amp;#034;] &amp;amp;),&#xD;
         stringifier = ExportString[#, &amp;#034;RawJSON&amp;#034;, &amp;#034;Compact&amp;#034; -&amp;gt; 0] &amp;amp;&#xD;
        },&#xD;
        RightComposition[&#xD;
               MapIndexed[Sow[{##}] &amp;amp;, #, {-1}] &amp;amp;, Reap, #[[2, 1]] &amp;amp;&#xD;
               , {#, Range[Length@#]}\[Transpose] &amp;amp;&#xD;
               , GroupBy[IntegerQ[#[[1, 2, -1]]] &amp;amp;]&#xD;
               ,(* atomic values: *)&#xD;
               MapAt[&#xD;
                     MapAt[Apply[{#1 // stringifier, #2 // keychainCombiner} &amp;amp; /* Apply[StringJoin[#2, &amp;#034; = &amp;#034;, #1] &amp;amp;]], {;; , 1}]&#xD;
                     , Key[False]]&#xD;
               ,(* numerical arrays: *)&#xD;
               If[# // KeyExistsQ[True]&#xD;
                  , # // MapAt[&#xD;
                    RightComposition[&#xD;
                           MapAt[Apply[{#2 // Most, {Last[#2], #1}} &amp;amp;], {;; , 1}]&#xD;
                           , GroupBy[(#[[1, 1]] &amp;amp;) -&amp;gt; ({#[[1, 2]], #[[2]]} &amp;amp;)]&#xD;
                           , Map@branch[&#xD;
                                  (#[[;; , 1, 2]] &amp;amp;) /* stringifier /* (StringReplace[{&amp;#034;,&amp;#034; -&amp;gt; &amp;#034; &amp;#034;, &amp;#034;[&amp;#034; | &amp;#034;]&amp;#034; :&amp;gt; &amp;#034;&amp;#034;}])&#xD;
                                  , #[[1, 2]] &amp;amp;&#xD;
                                  ]&#xD;
                           , KeyMap[keychainCombiner]&#xD;
                           , KeyValueMap[{StringJoin[#1, &amp;#034; = &amp;#034;, #2[[1]]], #2[[2]]} &amp;amp;]&#xD;
                           ]&#xD;
                    , Key[True]]&#xD;
                  , #] &amp;amp;&#xD;
               , Values, Apply@Join, SortBy[Last], #[[;; , 1]] &amp;amp;&#xD;
               , Riffle[#, &amp;#034;\n&amp;#034;] &amp;amp;, StringJoin&#xD;
               ]&#xD;
        ];&#xD;
```&#xD;
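
The SDL format itself is just dotted property paths with `=` values, so the writer&amp;#039;s job can be illustrated with a much simpler sketch in Python (a hypothetical helper, not a replacement for the luxcoreWriter above; it handles strings, numbers and flat numeric lists only):

```python
def to_sdl(props, prefix=""):
    """Flatten a nested dict into LuxCore-style 'a.b.c = value' lines."""
    lines = []
    for key, value in props.items():
        path = key if prefix == "" else prefix + "." + key
        if isinstance(value, dict):
            # recurse, extending the dotted property path
            lines.extend(to_sdl(value, path))
        elif isinstance(value, str):
            lines.append('{} = "{}"'.format(path, value))
        elif isinstance(value, (list, tuple)):
            # numeric arrays are written space-separated
            joined = " ".join(str(v) for v in value)
            lines.append("{} = {}".format(path, joined))
        else:
            lines.append("{} = {}".format(path, value))
    return lines

sdl = to_sdl({"scene": {"camera": {"type": "perspective",
                                   "lookat": {"orig": [7.0, -7.0, 5.0]}}}})
```
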
&#xD;
## Anchor the world&#xD;
&#xD;
Before any rendering happens, a scene is necessary. The center of our scene is going to be the origin of our &amp;#034;world&amp;#034; coordinate system.&#xD;
&#xD;
```&#xD;
stageCenter = {0, 0, 0};&#xD;
```&#xD;
&#xD;
## Lighting&#xD;
&#xD;
### Spotlight&#xD;
&#xD;
On a stage, the most indispensable thing is lighting. Among all the lights, nothing attracts more attention than a spotlight.&#xD;
To project a spotlight onto the stage, we need to determine its position, aiming target, color, size, etc. (Reference: `&amp;#034;Spot&amp;#034;` in documentation of [`Lighting`](http://reference.wolfram.com/language/ref/Lighting.html) )&#xD;
&#xD;
```&#xD;
spotLightPos = {4.5, 0, 6};&#xD;
&#xD;
Clear[spotLight]&#xD;
spotLight := &amp;lt;|&amp;#034;type&amp;#034; -&amp;gt; &amp;#034;spot&amp;#034;&#xD;
               , &amp;#034;position&amp;#034; -&amp;gt; spotLightPos, &amp;#034;target&amp;#034; -&amp;gt; stageCenter&#xD;
               , &amp;#034;color&amp;#034; -&amp;gt; RGBColor[1, 1, 1], &amp;#034;power&amp;#034; -&amp;gt; 100, &amp;#034;efficency&amp;#034; -&amp;gt; 17, &#xD;
               &amp;#034;gain&amp;#034; -&amp;gt; {1, 1, 1}&#xD;
               , &amp;#034;coneangle&amp;#034; -&amp;gt; 14 \[Degree] (* &amp;lt;- half angle of the cone *)&#xD;
               , &amp;#034;conedeltaangle&amp;#034; -&amp;gt; 1 \[Degree]&#xD;
               |&amp;gt; //&#xD;
      Function[cfg&#xD;
               , cfg // &#xD;
                RightComposition @@ &#xD;
                 MapThread[&#xD;
                     If[KeyExistsQ[cfg, #2], MapAt[##], Identity] &amp;amp;&#xD;
                        , {&#xD;
                             {&amp;#034;color&amp;#034;         , List @@ ColorConvert[#, &amp;#034;RGB&amp;#034;] &amp;amp;}&#xD;
                           , {&amp;#034;transformation&amp;#034;, TransformationMatrix /* Transpose /* Flatten}&#xD;
                           , {&amp;#034;coneangle&amp;#034;     , N[#]/Degree &amp;amp;}&#xD;
                           , {&amp;#034;conedeltaangle&amp;#034;, N[#]/Degree &amp;amp;}&#xD;
                          }\[Transpose] // Reverse]&#xD;
               ]&#xD;
```&#xD;
&#xD;
For more detail about lighting in LuxCoreRender, please refer to the [official wiki](https://wiki.luxcorerender.org/Lighting) and [document page](https://wiki.luxcorerender.org/LuxCore_SDL_Reference_Manual_v2.2#Light_sources).&#xD;
&#xD;
## Camera&#xD;
&#xD;
Rendering is like taking a photo: you carefully set up your camera, adjust the aperture and focal distance, and look for the best angle of view.&#xD;
&#xD;
We set up a camera aiming at the stage from an angle roughly perpendicular to the light and focusing on the center. With this setting, we hope to capture rich shades of light and shadow.&#xD;
&#xD;
### Perspective Camera&#xD;
&#xD;
```&#xD;
cameraPos        = {7., -7., 5.};&#xD;
cameraTarget     = stageCenter + {-.2, 0, -.3};&#xD;
cameraAimingAxis = cameraTarget - cameraPos;&#xD;
cameraFocus      = stageCenter;&#xD;
```&#xD;
&#xD;
`cameraUpVecF[` $\theta$`,cameraAimingAxis]` will be used to adjust the orientation of our film. Setting $\theta$ to $0^\circ$ or $90^\circ$ corresponds to landscape or portrait orientation, respectively.&#xD;
&#xD;
```&#xD;
Clear[cameraUpVecF]&#xD;
cameraUpVecF = Function[{\[Theta], aimingAxis}, aimingAxis //&#xD;
    RightComposition[&#xD;
          RotationTransform[90 \[Degree], {aimingAxis, {0, 0, 1}}]&#xD;
          , RotationTransform[\[Theta], aimingAxis]&#xD;
          , Developer`FromPackedArray (* &amp;lt;- Currently our luxcoreWriter cannot handle PackedArray correctly *)&#xD;
          ]];&#xD;
```&#xD;
```&#xD;
Clear[camera]&#xD;
camera := &amp;lt;|&#xD;
     &amp;#034;type&amp;#034;          -&amp;gt; &amp;#034;perspective&amp;#034;&#xD;
     (* (* use the default clipping *)&#xD;
        ,&amp;#034;cliphither&amp;#034; -&amp;gt; 0           (* &amp;lt;- Working like minimum distance in M&amp;#039;s ViewRange *)&#xD;
        ,&amp;#034;clipyon&amp;#034;    -&amp;gt; \[Infinity] (* &amp;lt;- Working like maximum distance in M&amp;#039;s ViewRange *)&#xD;
     *)&#xD;
   , &amp;#034;lookat&amp;#034;        -&amp;gt; &amp;lt;|&amp;#034;orig&amp;#034; -&amp;gt; cameraPos, &amp;#034;target&amp;#034; -&amp;gt; cameraTarget|&amp;gt;&#xD;
   , &amp;#034;up&amp;#034;            -&amp;gt; cameraUpVecF[0 \[Degree], cameraAimingAxis]&#xD;
   , &amp;#034;fieldofview&amp;#034;   -&amp;gt; 25 \[Degree]  (* &amp;lt;- Working like M&amp;#039;s ViewAngle *)&#xD;
   , &amp;#034;lensradius&amp;#034;    -&amp;gt; 0.1           (* &amp;lt;- Set a positive value to enable blurring by depth-of-field, 0 to disable it. *)&#xD;
   , &amp;#034;focaldistance&amp;#034; -&amp;gt; Norm[cameraFocus - cameraPos]&#xD;
   |&amp;gt; // Function[cfg&#xD;
   , cfg // RightComposition @@&#xD;
     MapThread[If[KeyExistsQ[cfg, #2], MapAt[##], Identity] &amp;amp;, {&#xD;
         (* clip-range must be positive real number in luxcore renderer: *)&#xD;
           {&amp;#034;cliphither&amp;#034;  , Clip[#, {.001, 1. 10^30}] &amp;amp;}&#xD;
         , {&amp;#034;clipyon&amp;#034;     , Clip[#, {.001, 1. 10^30}] &amp;amp;}&#xD;
         , {&amp;#034;fieldofview&amp;#034; , N[#]/Degree &amp;amp;}&#xD;
         }\[Transpose] // Reverse]&#xD;
   ]&#xD;
```&#xD;
&#xD;
For more detail about camera, please refer to the [official wiki](https://wiki.luxcorerender.org/LuxCoreRender_Cameras) and [document page](https://wiki.luxcorerender.org/LuxCore_SDL_Reference_Manual_v2.2#Camera).&#xD;
&#xD;
## Object&#xD;
&#xD;
In LuxCoreRender, solid objects can be represented by meshes and materials. The easiest way to bring in a mesh is through the [PLY format](http://reference.wolfram.com/language/ref/format/PLY.html), which is well supported in both LuxCoreRender and Mathematica.&#xD;
&#xD;
### Stage plane&#xD;
&#xD;
We have been talking a lot about the stage, but we haven&amp;#039;t actually built one yet. We are going to forge a minimalist one from a square plane.&#xD;
&#xD;
```&#xD;
stage     = Polygon[Append[0] /@ CirclePoints[{5, \[Pi]/4}, 4]] // TranslationTransform[{0, 0, -1.7}];&#xD;
stageMesh = stage // DiscretizeRegion // Export[&amp;#034;stage_plane.ply&amp;#034;, #, &amp;#034;PLY&amp;#034;, &amp;#034;BinaryFormat&amp;#034; -&amp;gt; True] &amp;amp;;&#xD;
```&#xD;
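As a quick numeric sanity check (a Python sketch of the geometry, not the WL code above): `CirclePoints[{5, \[Pi]/4}, 4]` places the four corners on a circle of radius 5 starting at a $45^\circ$ angle, and the translation drops them to $z = -1.7$.

```python
import math

def circle_points(r, theta0, n):
    """n points evenly spaced on a circle of radius r, the first at angle theta0."""
    return [(r * math.cos(theta0 + 2 * math.pi * k / n),
             r * math.sin(theta0 + 2 * math.pi * k / n)) for k in range(n)]

# square corners, lowered to the stage height z = -1.7
stage_corners = [(x, y, -1.7) for x, y in circle_points(5, math.pi / 4, 4)]
```

The first corner lands at $5/\sqrt{2} \approx 3.54$ on both the x and y axes.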
&#xD;
We use LuxCoreRender&amp;#039;s built-in Lambertian material to style our stage, with a neutral gray diffuse color.&#xD;
&#xD;
```&#xD;
stageMaterial = &amp;lt;|&amp;#034;type&amp;#034; -&amp;gt; &amp;#034;matte&amp;#034;, &amp;#034;kd&amp;#034; -&amp;gt; {0.5, 0.5, 0.5}|&amp;gt;;&#xD;
```&#xD;
&#xD;
The full configuration is&#xD;
&#xD;
```&#xD;
stageConfig = &amp;lt;|&#xD;
     &amp;#034;objects&amp;#034;   -&amp;gt; &amp;lt;|&amp;#034;stage_plane&amp;#034; -&amp;gt; &amp;lt;|&amp;#034;ply&amp;#034; -&amp;gt; stageMesh, &amp;#034;material&amp;#034; -&amp;gt; &amp;#034;stage_mat&amp;#034;|&amp;gt;|&amp;gt;&#xD;
   , &amp;#034;materials&amp;#034; -&amp;gt; &amp;lt;|&amp;#034;stage_mat&amp;#034;   -&amp;gt; stageMaterial                                    |&amp;gt;&#xD;
   |&amp;gt;;&#xD;
```&#xD;
&#xD;
### Main Role&#xD;
&#xD;
Now everything on the stage is ready except for the star. We are going to put a diamond at the center.&#xD;
&#xD;
[TruncatedPolyhedron](http://reference.wolfram.com/language/ref/TruncatedPolyhedron.html) is a handy function to cut a gem. With some additional positioning, in two lines we have our diamond mesh ready.&#xD;
&#xD;
```&#xD;
gem = Fold[TruncatedPolyhedron, Octahedron[], ConstantArray[.3, 2]] //&#xD;
             ScalingTransform[1.3 {1, 1, 1}] //&#xD;
             RotationTransform[-10 Degree, {0, 0, 1}];&#xD;
gem // Graphics3D[#, Boxed -&amp;gt; False] &amp;amp;&#xD;
gemMesh = gem // DiscretizeGraphics // RegionBoundary // Export[&amp;#034;gem.ply&amp;#034;, #, &amp;#034;PLY&amp;#034;, &amp;#034;BinaryFormat&amp;#034; -&amp;gt; True] &amp;amp;;&#xD;
```&#xD;
&#xD;
![gem mesh as Graphics3D][2]&#xD;
&#xD;
Recall that our camera is focused on the stage center; now we would like to adjust it to focus on the top facet of the diamond.&#xD;
&#xD;
```&#xD;
gemBd   = gem   // BoundaryDiscretizeGraphics;&#xD;
facetID = gemBd // {MeshCellIndex[#, 2], PropertyValue[{#, 2}, MeshCellCentroid]}\[Transpose] &amp;amp; // &#xD;
                   MaximalBy[#[[2, -1]] &amp;amp;] // #[[1, 1]] &amp;amp;;&#xD;
```&#xD;
&#xD;
Or, more specifically, on the corner of that facet nearest to our camera. Because we used a delayed definition (reference: [`SetDelayed`](http://reference.wolfram.com/language/ref/SetDelayed.html)), our `camera` will automatically be up to date every time we query it.&#xD;
&#xD;
```&#xD;
cameraFocus = MeshPrimitives[gemBd, facetID][[1]] // MinimalBy[Norm[# - cameraPos] &amp;amp;] // First;&#xD;
Graphics3D[{gem, {Red, Sphere[cameraFocus, .05]}}&#xD;
           , ViewVector -&amp;gt; Values[camera[&amp;#034;lookat&amp;#034;]]&#xD;
           , Boxed -&amp;gt; False]&#xD;
```&#xD;
&#xD;
![gem mesh, highlight the focus-on point][3]&#xD;
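The focus-point selection above boils down to taking the vertex of the chosen facet that is closest to the camera; that single step looks like this in a minimal Python sketch (the helper name is hypothetical):

```python
import math

def nearest_point(points, reference):
    """Return the point with minimal Euclidean distance to reference
    (mirrors MinimalBy[Norm[# - cameraPos] &] // First)."""
    return min(points, key=lambda p: math.dist(p, reference))
```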
&#xD;
To emulate a real diamond, we use LuxCoreRender&amp;#039;s built-in &amp;#034;glass&amp;#034; material with the refractive index of diamond.&#xD;
&#xD;
```&#xD;
gemIOR = Entity[&amp;#034;Mineral&amp;#034;, &amp;#034;Diamond&amp;#034;]@EntityProperty[&amp;#034;Mineral&amp;#034;, &amp;#034;RefractiveIndices&amp;#034;] // First&#xD;
(* Out[]= 2.418 *)&#xD;
```&#xD;
&#xD;
For the dispersion, [LuxCoreRender uses the empirical Cauchy formula](https://wiki.luxcorerender.org/LuxCore_SDL_Reference_Manual_v2.2#Type:_glass). We&amp;#039;ll set its coefficient to a rather high value for a dramatic effect. (For details about the formula, please refer to http://scienceworld.wolfram.com/physics/CauchysFormula.html .)&#xD;
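To get a feel for the magnitude: the two-term Cauchy formula is $n(\lambda) = A + B/\lambda^2$, and with $B = 0.02$ the index spreads noticeably across the visible range. (A Python sketch; that $\lambda$ is measured in micrometers is our assumption about the convention here.)

```python
def cauchy_n(lam_um, a, b):
    """Two-term Cauchy formula: n(lambda) = A + B / lambda^2."""
    return a + b / lam_um ** 2

# assumed convention: wavelength in micrometers, B (cauchyc) in um^2
n_blue = cauchy_n(0.4, 2.418, 0.02)   # violet-blue end of the spectrum
n_red  = cauchy_n(0.7, 2.418, 0.02)   # red end of the spectrum
```

That gives $n \approx 2.543$ at 400 nm versus $n \approx 2.459$ at 700 nm, which is what produces the colorful fire in the render.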
&#xD;
```&#xD;
gemDispersion = 0.02;&#xD;
&#xD;
gemMaterial = &amp;lt;|&#xD;
     &amp;#034;type&amp;#034;        -&amp;gt; &amp;#034;glass&amp;#034;&#xD;
   , &amp;#034;kr&amp;#034;          -&amp;gt; {1, 1, 1}     (* &amp;lt;- reflected color *)&#xD;
   , &amp;#034;kt&amp;#034;          -&amp;gt; {1, 1, 1}     (* &amp;lt;- transmitted color *)&#xD;
   , &amp;#034;interiorior&amp;#034; -&amp;gt; gemIOR        (* &amp;lt;- refractive index *)&#xD;
   , &amp;#034;cauchyc&amp;#034;     -&amp;gt; gemDispersion (* &amp;lt;- coefficient of \[Lambda]^-2 term in Cauchy&amp;#039;s dispersion formula *)&#xD;
   |&amp;gt;;&#xD;
```&#xD;
&#xD;
The full configuration is&#xD;
&#xD;
```&#xD;
gemConfig = &amp;lt;|&#xD;
     &amp;#034;objects&amp;#034;   -&amp;gt; &amp;lt;|&amp;#034;gem&amp;#034;     -&amp;gt; &amp;lt;|&amp;#034;ply&amp;#034; -&amp;gt; gemMesh, &amp;#034;material&amp;#034; -&amp;gt; &amp;#034;gem_mat&amp;#034;|&amp;gt;|&amp;gt;&#xD;
   , &amp;#034;materials&amp;#034; -&amp;gt; &amp;lt;|&amp;#034;gem_mat&amp;#034; -&amp;gt; gemMaterial                                  |&amp;gt;&#xD;
   |&amp;gt;;&#xD;
```&#xD;
&#xD;
## The Whole Scene&#xD;
&#xD;
Combining all the lights, camera and objects together, we have a full specification of our scene.&#xD;
&#xD;
```&#xD;
sceneConfig = &amp;lt;|&amp;#034;scene&amp;#034; -&amp;gt; Join[&#xD;
     &amp;lt;|&#xD;
        &amp;#034;camera&amp;#034; -&amp;gt; camera&#xD;
      , &amp;#034;lights&amp;#034; -&amp;gt; &amp;lt;|&amp;#034;spot_light_1&amp;#034; -&amp;gt; spotLight|&amp;gt;&#xD;
      |&amp;gt;&#xD;
     , Merge[{stageConfig, gemConfig}, Apply@Join]&#xD;
     ]|&amp;gt;;&#xD;
```&#xD;
&#xD;
We use our exporter `luxcoreWriter` to format the scene configuration to match LuxCoreRender&amp;#039;s scn DSL specification.&#xD;
&#xD;
```&#xD;
sceneFile = sceneConfig // luxcoreWriter // Export[&amp;#034;diamond.scn&amp;#034;, #, &amp;#034;String&amp;#034;] &amp;amp;&#xD;
(* Out[]= diamond.scn *)&#xD;
```&#xD;
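Under the hood, the exporter essentially flattens the nested association into dotted-key property lines such as `scene.camera.type = ...`. A minimal Python sketch of that flattening idea (not the actual `luxcoreWriter`, which also quotes strings and handles more value types):

```python
def flatten(cfg, prefix=''):
    """Flatten a nested dict into dotted-key lines like 'a.b.c = v1 v2 ...'."""
    lines = []
    for key, val in cfg.items():
        path = f'{prefix}.{key}' if prefix else key
        if isinstance(val, dict):
            lines.extend(flatten(val, path))
        else:
            # scalars and vectors become space-separated values
            vals = val if isinstance(val, (list, tuple)) else [val]
            lines.append(f'{path} = ' + ' '.join(str(v) for v in vals))
    return lines
```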
&#xD;
## Render&#xD;
&#xD;
Our post is already lengthy, so I&amp;#039;m not going into details about the render configuration here. I&amp;#039;ll just point out that our importer/exporter can handle both scene description files (.scn) and render configuration files (.cfg).&#xD;
&#xD;
Another thing worth pointing out to readers unfamiliar with rendering is the choice of render strategy. Since in this case we would like to see nice caustics from the diamond, the **&amp;#034;BIDIRCPU&amp;#034;** engine coupled with the **&amp;#034;METROPOLIS&amp;#034;** sampler is our best choice. We may talk about other methods in the future.&#xD;
&#xD;
```&#xD;
renderConfig = &amp;lt;|&#xD;
   &amp;#034;scene&amp;#034;          -&amp;gt; &amp;lt;|&amp;#034;file&amp;#034;     -&amp;gt; sceneFile   |&amp;gt;&#xD;
   , &amp;#034;renderengine&amp;#034; -&amp;gt; &amp;lt;|&amp;#034;type&amp;#034;     -&amp;gt; &amp;#034;BIDIRCPU&amp;#034;  |&amp;gt;&#xD;
   , &amp;#034;sampler&amp;#034;      -&amp;gt; &amp;lt;|&amp;#034;type&amp;#034;     -&amp;gt; &amp;#034;METROPOLIS&amp;#034;|&amp;gt;&#xD;
   , &amp;#034;light&amp;#034;        -&amp;gt; &amp;lt;|&amp;#034;maxdepth&amp;#034; -&amp;gt; 20          |&amp;gt;&#xD;
   , &amp;#034;path&amp;#034;         -&amp;gt; &amp;lt;|&amp;#034;maxdepth&amp;#034; -&amp;gt; 20          |&amp;gt;&#xD;
   , &amp;#034;batch&amp;#034;        -&amp;gt; &amp;lt;|&amp;#034;haltspp&amp;#034;  -&amp;gt; 2000        |&amp;gt;&#xD;
   , &amp;#034;film&amp;#034; -&amp;gt; &amp;lt;|&#xD;
                  &amp;#034;width&amp;#034; -&amp;gt; 1920, &amp;#034;height&amp;#034; -&amp;gt; 1080&#xD;
                , &amp;#034;filter&amp;#034; -&amp;gt; &amp;lt;|&amp;#034;type&amp;#034; -&amp;gt; &amp;#034;NONE&amp;#034;|&amp;gt;&#xD;
                , &amp;#034;imagepipelines&amp;#034; -&amp;gt; &amp;lt;| &amp;#034;0&amp;#034; -&amp;gt; &amp;lt;| &amp;#034;0&amp;#034; -&amp;gt; &amp;lt;|&amp;#034;type&amp;#034; -&amp;gt; &amp;#034;TONEMAP_AUTOLINEAR&amp;#034;               |&amp;gt;&#xD;
                                                 , &amp;#034;1&amp;#034; -&amp;gt; &amp;lt;|&amp;#034;type&amp;#034; -&amp;gt; &amp;#034;GAMMA_CORRECTION&amp;#034;, &amp;#034;value&amp;#034; -&amp;gt; 2.5`|&amp;gt;&#xD;
                                                 |&amp;gt;&#xD;
                                       , &amp;#034;1&amp;#034; -&amp;gt; &amp;lt;| &amp;#034;0&amp;#034; -&amp;gt; &amp;lt;|&amp;#034;type&amp;#034; -&amp;gt; &amp;#034;INTEL_OIDN&amp;#034;                       |&amp;gt;&#xD;
                                                 , &amp;#034;1&amp;#034; -&amp;gt; &amp;lt;|&amp;#034;type&amp;#034; -&amp;gt; &amp;#034;TONEMAP_AUTOLINEAR&amp;#034;               |&amp;gt;&#xD;
                                                 , &amp;#034;2&amp;#034; -&amp;gt; &amp;lt;|&amp;#034;type&amp;#034; -&amp;gt; &amp;#034;GAMMA_CORRECTION&amp;#034;, &amp;#034;value&amp;#034; -&amp;gt; 2.5`|&amp;gt;&#xD;
                                                 |&amp;gt;&#xD;
                                       |&amp;gt;&#xD;
                , &amp;#034;outputs&amp;#034; -&amp;gt; &amp;lt;| &amp;#034;0&amp;#034; -&amp;gt; &amp;lt;|&amp;#034;index&amp;#034; -&amp;gt; 0, &amp;#034;type&amp;#034; -&amp;gt; &amp;#034;RGB_IMAGEPIPELINE&amp;#034;, &amp;#034;filename&amp;#034; -&amp;gt; &amp;#034;diamond_original.png&amp;#034;|&amp;gt;&#xD;
                                , &amp;#034;1&amp;#034; -&amp;gt; &amp;lt;|&amp;#034;index&amp;#034; -&amp;gt; 1, &amp;#034;type&amp;#034; -&amp;gt; &amp;#034;RGB_IMAGEPIPELINE&amp;#034;, &amp;#034;filename&amp;#034; -&amp;gt; &amp;#034;diamond_denoised.png&amp;#034;|&amp;gt;&#xD;
                                |&amp;gt;&#xD;
                |&amp;gt;&#xD;
   |&amp;gt;;&#xD;
&#xD;
renderCfgFile = renderConfig // luxcoreWriter // Export[&amp;#034;diamond.cfg&amp;#034;, #, &amp;#034;String&amp;#034;] &amp;amp;&#xD;
```&#xD;
&#xD;
We can import the configuration file back and browse it with [`Dataset`](http://reference.wolfram.com/language/ref/Dataset.html) just as easily.&#xD;
&#xD;
```&#xD;
Import[renderCfgFile, &amp;#034;String&amp;#034;] // StringTrim // luxcoreReader // Dataset&#xD;
```&#xD;
&#xD;
![rendering configuration][4]&#xD;
&#xD;
To start rendering, we use the portable **LuxCoreRender Standalone release v2.2** from https://luxcorerender.org/download/ . (The one I used is *Python 3.7 and Windows 64bit with OpenCL support*).&#xD;
&#xD;
After extracting it, you&amp;#039;ll find two executables, *pyluxcoretool* and *luxcoreui*. You can either start the renderer from the console by executing (from our working directory, which is `Directory[]`)&#xD;
&#xD;
    pyluxcoretool console diamond.cfg&#xD;
&#xD;
or by loading the .cfg file through **Rendering &amp;#187; Load** menu in *luxcoreui* (which has a simple GUI).&#xD;
&#xD;
To be lazy and stay in Mathematica, on Windows you can invoke the console as follows. (Please remember to change `luxbin` to your actual path.)&#xD;
&#xD;
````&#xD;
luxbin = &amp;#034;path\\to\\your\\pyluxcoretool.exe&amp;#034;;&#xD;
&#xD;
luxPS = StartProcess[{$SystemShell, &amp;#034;/C&amp;#034;,&#xD;
   StringTemplate[&#xD;
     &amp;#034;start /I /WAIT cmd /C \&amp;#034;\&amp;#034;`bin`\&amp;#034; console \&amp;#034;`cfg`\&amp;#034;\&amp;#034;&amp;#034;][&amp;lt;|&#xD;
     &amp;#034;bin&amp;#034; -&amp;gt; luxbin,&#xD;
     &amp;#034;cfg&amp;#034; -&amp;gt; renderCfgFile&#xD;
     |&amp;gt;]&#xD;
   }]&#xD;
````&#xD;
&#xD;
When the render process finishes, we shall find the results under our current working directory with filenames according to `renderConfig[[&amp;#034;film&amp;#034;,&amp;#034;outputs&amp;#034;,;;,&amp;#034;filename&amp;#034;]]`.&#xD;
&#xD;
We have set the render halt condition to 2000 samples per pixel (`renderConfig[[&amp;#034;batch&amp;#034;,&amp;#034;haltspp&amp;#034;]]`). For this scene it takes almost 1 hour on a computer with an Intel Core i7-8750H CPU. (The &amp;#034;BIDIRCPU&amp;#034; engine does most of its work on the CPU.) For a quick preview render, you can set it to a smaller value (e.g. 20). A smaller `renderConfig[[&amp;#034;film&amp;#034;,{&amp;#034;width&amp;#034;,&amp;#034;height&amp;#034;}]]` will also reduce the computing time (at the cost of smaller resulting images). Additionally, you can always interrupt the render process at any time (on Windows press **Ctrl + C** in cmd, or click the **Rendering &amp;#187; Pause** menu in *luxcoreui*). In console mode the intermediate result is saved to disk; in *luxcoreui* it can be saved manually from the **Film &amp;#187; Save outputs** menu.&#xD;
&#xD;
## Epilog&#xD;
&#xD;
Hopefully you have obtained a beautiful photo of the diamond like our cover image.&#xD;
&#xD;
So far we have demonstrated in Mathematica how to set up a simple scene and render it with LuxCoreRender. We plan to talk about more advanced topics (like HDRI, volumes, how to generate smooth meshes, etc.) in the future.&#xD;
&#xD;
## References&#xD;
&#xD;
 - [Visualization: Advanced 3D Graphics](https://www.wolfram.com/training/videos/GEN422/)&#xD;
 &#xD;
 - [Extract values for ViewMatrix from a Graphics3D](https://mathematica.stackexchange.com/questions/3528/extract-values-for-viewmatrix-from-a-graphics3d)&#xD;
 &#xD;
 - [LuxCore SDL Reference Manual v2.2](https://wiki.luxcorerender.org/LuxCore_SDL_Reference_Manual_v2.2)&#xD;
 &#xD;
 - [LuxCoreRender User&amp;#039;s Manual](https://wiki.luxcorerender.org/LuxCoreRender_User%27s_Manual)&#xD;
 &#xD;
 - [LuxCore test scenes project](https://github.com/LuxCoreRender/LuxCoreTestScenes)&#xD;
&#xD;
## Bonus&#xD;
&#xD;
 - Click the cover image to get the full sized version!&#xD;
&#xD;
 - Find the Notebook for this post below!&#xD;
&#xD;
&#xD;
  [1]: https://community.wolfram.com//c/portal/getImageAttachment?filename=diamond_denoised_small.png&amp;amp;userId=93201&#xD;
  [2]: https://community.wolfram.com//c/portal/getImageAttachment?filename=gem_mesh.png&amp;amp;userId=93201&#xD;
  [3]: https://community.wolfram.com//c/portal/getImageAttachment?filename=gem_mesh_with_focus_point.png&amp;amp;userId=93201&#xD;
  [4]: https://community.wolfram.com//c/portal/getImageAttachment?filename=render_cfg.png&amp;amp;userId=93201</description>
    <dc:creator>Silvia Hao</dc:creator>
    <dc:date>2019-10-22T15:49:06Z</dc:date>
  </item>
  <item rdf:about="https://community.wolfram.com/groups/-/m/t/1801300">
    <title>Ray-tracing Graphics3D with UnityLink</title>
    <link>https://community.wolfram.com/groups/-/m/t/1801300</link>
    <description>[![Rotating stage][13]][14]&#xD;
&#xD;
Here is an example of a basic scene rendered by calling Unity from Mathematica.  Of course, a lot more work may be required to make it pretty.&#xD;
&#xD;
    Needs[&amp;#034;UnityLink`&amp;#034;];&#xD;
    UnityOpen[&amp;#034;SphereLighting&amp;#034;];&#xD;
    CreateUnityAssetDirectory[{&amp;#034;Scenes&amp;#034;, &amp;#034;Meshes&amp;#034;, &amp;#034;Materials&amp;#034;}];&#xD;
    CreateUnityScene[File[&amp;#034;Scenes/SphereLight&amp;#034;]];&#xD;
    boxwalls = {CreateUnityPlane[&amp;#034;Box_bottom&amp;#034;, &#xD;
        Properties -&amp;gt; {&amp;#034;Position&amp;#034; -&amp;gt; {0, 0, 0}, &amp;#034;LocalScale&amp;#034; -&amp;gt; {1, 1, 1},&#xD;
           &amp;#034;EulerAngles&amp;#034; -&amp;gt; {0, 0, 0}}],&#xD;
       CreateUnityPlane[&amp;#034;Box_top&amp;#034;, &#xD;
        Properties -&amp;gt; {&amp;#034;Position&amp;#034; -&amp;gt; {0, 10, 0}, &#xD;
          &amp;#034;LocalScale&amp;#034; -&amp;gt; {1, 1, 1}, &amp;#034;EulerAngles&amp;#034; -&amp;gt; {0, 0, 0}}],&#xD;
       CreateUnityPlane[&amp;#034;Box_left&amp;#034;, &#xD;
        Properties -&amp;gt; {&amp;#034;Position&amp;#034; -&amp;gt; {-5, 5, 0}, &#xD;
          &amp;#034;LocalScale&amp;#034; -&amp;gt; {1, 1, 1}, &amp;#034;EulerAngles&amp;#034; -&amp;gt; {0, 0, 90}}],&#xD;
       CreateUnityPlane[&amp;#034;Box_right&amp;#034;, &#xD;
        Properties -&amp;gt; {&amp;#034;Position&amp;#034; -&amp;gt; {5, 5, 0}, &amp;#034;LocalScale&amp;#034; -&amp;gt; {1, 1, 1},&#xD;
           &amp;#034;EulerAngles&amp;#034; -&amp;gt; {0, 0, 90}}],&#xD;
       CreateUnityPlane[&amp;#034;Box_back&amp;#034;, &#xD;
        Properties -&amp;gt; {&amp;#034;Position&amp;#034; -&amp;gt; {0, 5, 5}, &amp;#034;LocalScale&amp;#034; -&amp;gt; {1, 1, 1},&#xD;
           &amp;#034;EulerAngles&amp;#034; -&amp;gt; {90, 0, 0}}]};&#xD;
    hexToRGB = &#xD;
      RGBColor @@ (IntegerDigits[#~StringDrop~1~FromDigits~16, 256, 3]/&#xD;
          255.) &amp;amp;;&#xD;
    boxmat = CreateUnityMaterial[File[&amp;#034;Materials/Mat_box&amp;#034;], &#xD;
       Properties -&amp;gt; {&amp;#034;Color&amp;#034; -&amp;gt; hexToRGB[&amp;#034;#F9F9F9&amp;#034;], &#xD;
         &amp;#034;_Glossiness&amp;#034; -&amp;gt; 0.0}];&#xD;
    SetProperty[#, &amp;#034;SharedMaterial&amp;#034; -&amp;gt; boxmat] &amp;amp; /@ boxwalls;&#xD;
    sphere = CreateUnityGameObject[&amp;#034;sphere&amp;#034;, &#xD;
       Graphics3D[Sphere[{0, 0, 0}]], &#xD;
       Properties -&amp;gt; {&amp;#034;Position&amp;#034; -&amp;gt; {0, 1, 2}}];&#xD;
    spheremat = &#xD;
      CreateUnityMaterial[File[&amp;#034;Materials/Mat_sphere&amp;#034;], &#xD;
       Properties -&amp;gt; {&amp;#034;Color&amp;#034; -&amp;gt; hexToRGB[&amp;#034;#F9F9F9&amp;#034;], &#xD;
         &amp;#034;_Glossiness&amp;#034; -&amp;gt; 0.75}];&#xD;
    SetProperty[sphere, &amp;#034;SharedMaterial&amp;#034; -&amp;gt; spheremat];&#xD;
    UnityCameraImage[]&#xD;
    positions = {{5, 1, 0}, {0, 1, 5}, {-5, 1, 0}, {0, 1, -5}, {0, 10, 0}};&#xD;
    angles = {{0, 270, 0}, {0, 180, 0}, {0, 90, 0}, {0, 0, 0}, {90, 0, 0}};&#xD;
    camera = CreateUnityCamera[&amp;#034;Test Camera&amp;#034;];&#xD;
    go = camera[[&amp;#034;GameObject&amp;#034;]];&#xD;
    views = Table[go[[&amp;#034;Position&amp;#034;]] = positions[[i]];&#xD;
       go[[&amp;#034;EulerAngles&amp;#034;]] = angles[[i]];&#xD;
       UnityCameraImage[camera, ImageResolution -&amp;gt; 800, &#xD;
        ImageSize -&amp;gt; 300], {i, Length[positions]}];&#xD;
    DeleteUnityGameObject[go]&#xD;
    ListAnimate[views]&#xD;
&#xD;
[![Glossy Sphere In Unity][1]][2]&#xD;
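The `hexToRGB` helper in the listing above unpacks a `#RRGGBB` string into three base-256 digits and scales them to the unit interval; the equivalent logic in Python (the function name is hypothetical):

```python
def hex_to_rgb(s):
    """Convert '#RRGGBB' to an (r, g, b) triple of floats in [0, 1]."""
    n = int(s.lstrip('#'), 16)
    # extract the three base-256 digits and scale each to 0..1
    return tuple(((n >> shift) & 0xFF) / 255.0 for shift in (16, 8, 0))
```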
&#xD;
# Update to Include Material Shading Variations&#xD;
From the OP&amp;#039;s comment, they would like to see shading such as metallics and crystals. Ideally, one would use a physically based (principled) shader to achieve this result. The standard offering through Unity is rather basic, but there are literally hundreds of commercial offerings for creating realistic materials. The standard shader has no index of refraction (_IOR_) setting, so the best you can get is transparency.&#xD;
&#xD;
Here is an example of three materials from left to right of a metallic, transparent, and a diffuse material.  I also added a Reflection Probe, but I had to manually bake the scene to see the effect.&#xD;
&#xD;
    Needs[&amp;#034;UnityLink`&amp;#034;];&#xD;
    UnityOpen[&amp;#034;SphereLighting&amp;#034;];&#xD;
    CreateUnityAssetDirectory[{&amp;#034;Scenes&amp;#034;, &amp;#034;Meshes&amp;#034;, &amp;#034;Materials&amp;#034;}];&#xD;
    CreateUnityScene[File[&amp;#034;Scenes/SphereLight&amp;#034;]];&#xD;
    boxwalls = {CreateUnityPlane[&amp;#034;Box_bottom&amp;#034;, &#xD;
        Properties -&amp;gt; {&amp;#034;Position&amp;#034; -&amp;gt; {0, 0, 0}, &amp;#034;LocalScale&amp;#034; -&amp;gt; {1, 1, 1},&#xD;
           &amp;#034;EulerAngles&amp;#034; -&amp;gt; {0, 0, 0}}],&#xD;
       CreateUnityPlane[&amp;#034;Box_top&amp;#034;, &#xD;
        Properties -&amp;gt; {&amp;#034;Position&amp;#034; -&amp;gt; {0, 10, 0}, &#xD;
          &amp;#034;LocalScale&amp;#034; -&amp;gt; {1, 1, 1}, &amp;#034;EulerAngles&amp;#034; -&amp;gt; {0, 0, 180}}],&#xD;
       CreateUnityPlane[&amp;#034;Box_left&amp;#034;, &#xD;
        Properties -&amp;gt; {&amp;#034;Position&amp;#034; -&amp;gt; {-5, 5, 0}, &#xD;
          &amp;#034;LocalScale&amp;#034; -&amp;gt; {1, 1, 1}, &amp;#034;EulerAngles&amp;#034; -&amp;gt; {0, 0, -90}}],&#xD;
       CreateUnityPlane[&amp;#034;Box_right&amp;#034;, &#xD;
        Properties -&amp;gt; {&amp;#034;Position&amp;#034; -&amp;gt; {5, 5, 0}, &amp;#034;LocalScale&amp;#034; -&amp;gt; {1, 1, 1},&#xD;
           &amp;#034;EulerAngles&amp;#034; -&amp;gt; {0, 0, 90}}],&#xD;
       CreateUnityPlane[&amp;#034;Box_back&amp;#034;, &#xD;
        Properties -&amp;gt; {&amp;#034;Position&amp;#034; -&amp;gt; {0, 5, 5}, &amp;#034;LocalScale&amp;#034; -&amp;gt; {1, 1, 1},&#xD;
           &amp;#034;EulerAngles&amp;#034; -&amp;gt; {-90, 0, 0}}]};&#xD;
    hexToRGB = &#xD;
      RGBColor @@ (IntegerDigits[#~StringDrop~1~FromDigits~16, 256, 3]/&#xD;
          255.) &amp;amp;;&#xD;
    boxmat = CreateUnityMaterial[File[&amp;#034;Materials/Mat_box&amp;#034;], &#xD;
       Properties -&amp;gt; {&amp;#034;Color&amp;#034; -&amp;gt; hexToRGB[&amp;#034;#F9F9F9&amp;#034;], &#xD;
         &amp;#034;_Glossiness&amp;#034; -&amp;gt; 0.0}];&#xD;
    SetProperty[#, &amp;#034;SharedMaterial&amp;#034; -&amp;gt; boxmat] &amp;amp; /@ boxwalls;&#xD;
    sphere1 = &#xD;
      CreateUnityGameObject[&amp;#034;sphere1&amp;#034;, Graphics3D[Sphere[{0, 0, 0}]], &#xD;
       Properties -&amp;gt; {&amp;#034;Position&amp;#034; -&amp;gt; {2.25, 1, 2}}];&#xD;
    sphere1mat = &#xD;
      CreateUnityMaterial[File[&amp;#034;Materials/Mat_sphere1&amp;#034;], &#xD;
       Properties -&amp;gt; {&amp;#034;Color&amp;#034; -&amp;gt; hexToRGB[&amp;#034;#27EC86&amp;#034;], &#xD;
         &amp;#034;_Glossiness&amp;#034; -&amp;gt; 0.75}];&#xD;
    SetProperty[sphere1, &amp;#034;SharedMaterial&amp;#034; -&amp;gt; sphere1mat];&#xD;
    sphere2 = &#xD;
      CreateUnityGameObject[&amp;#034;sphere2&amp;#034;, Graphics3D[Sphere[{0, 0, 0}]], &#xD;
       Properties -&amp;gt; {&amp;#034;Position&amp;#034; -&amp;gt; {0, 1, 2}}];&#xD;
    sphere2mat = &#xD;
      CreateUnityMaterial[File[&amp;#034;Materials/Mat_sphere2&amp;#034;], &#xD;
       Properties -&amp;gt; {&amp;#034;RenderQueue&amp;#034; -&amp;gt; 3000, &#xD;
         &amp;#034;Color&amp;#034; -&amp;gt; &#xD;
          RGBColor[0.830726683139801, 0.8980392217636108, &#xD;
           0.33333340287208557, 0.4156862795352936], &amp;#034;_Glossiness&amp;#034; -&amp;gt; 1, &#xD;
         &amp;#034;_GlossMapScale&amp;#034; -&amp;gt; 0.8, &amp;#034;_Metallic&amp;#034; -&amp;gt; 0.1, &#xD;
         &amp;#034;ShaderKeywords&amp;#034; -&amp;gt; {&amp;#034;_ALPHAPREMULTIPLY_ON&amp;#034;, &#xD;
           &amp;#034;_SMOOTHNESS_TEXTURE_ALBEDO_CHANNEL_A&amp;#034;}, &amp;#034;_Mode&amp;#034; -&amp;gt; 3, &#xD;
         &amp;#034;_DstBlend&amp;#034; -&amp;gt; 10, &amp;#034;_ZWrite&amp;#034; -&amp;gt; 1.}];&#xD;
    SetProperty[sphere2, &amp;#034;SharedMaterial&amp;#034; -&amp;gt; sphere2mat];&#xD;
    CreateUnityReflectionProbe[sphere2];&#xD;
    sphere3 = &#xD;
      CreateUnityGameObject[&amp;#034;sphere3&amp;#034;, Graphics3D[Sphere[{0, 0, 0}]], &#xD;
       Properties -&amp;gt; {&amp;#034;Position&amp;#034; -&amp;gt; {-2.25, 1, 2}}];&#xD;
    sphere3mat = &#xD;
      CreateUnityMaterial[File[&amp;#034;Materials/Mat_sphere3&amp;#034;], &#xD;
       Properties -&amp;gt; {&amp;#034;Color&amp;#034; -&amp;gt; RGBColor[1, 1, 1], &amp;#034;_Glossiness&amp;#034; -&amp;gt; 1, &#xD;
         &amp;#034;_Metallic&amp;#034; -&amp;gt; 1}];&#xD;
    SetProperty[sphere3, &amp;#034;SharedMaterial&amp;#034; -&amp;gt; sphere3mat];&#xD;
    CreateUnityReflectionProbe[sphere3];&#xD;
    camera = FindUnityComponent[&amp;#034;Main Camera&amp;#034;];&#xD;
    camera[[&amp;#034;Position&amp;#034;]] = {-0.9, 3, -6.3};&#xD;
    UnityCameraImage[]&#xD;
    positions = {{5, 1, 0}, {0, 1, 5}, {-5, 1, 0}, {0, 1, -5}, {0, 10, 0}};&#xD;
    angles = {{0, 270, 0}, {0, 180, 0}, {0, 90, 0}, {0, 0, 0}, {90, 0, 0}};&#xD;
    camera = CreateUnityCamera[&amp;#034;Test Camera&amp;#034;];&#xD;
    go = camera[[&amp;#034;GameObject&amp;#034;]];&#xD;
    views = Table[go[[&amp;#034;Position&amp;#034;]] = positions[[i]];&#xD;
       go[[&amp;#034;EulerAngles&amp;#034;]] = angles[[i]];&#xD;
       UnityCameraImage[camera, ImageResolution -&amp;gt; 800, &#xD;
        ImageSize -&amp;gt; 300], {i, Length[positions]}];&#xD;
    DeleteUnityGameObject[go]&#xD;
    ListAnimate[views]&#xD;
&#xD;
[![Unbaked spheres][3]][4]&#xD;
&#xD;
After manually baking the Reflection Probes, I re-displayed the camera view with the following:&#xD;
&#xD;
    camera = FindUnityComponent[&amp;#034;Main Camera&amp;#034;];&#xD;
    camera[[&amp;#034;Position&amp;#034;]] = {-0.9, 3, -6.3};&#xD;
    UnityCameraImage[]&#xD;
    positions = {{5, 1, 0}, {0, 1, 5}, {-5, 1, 0}, {0, 1, -5}, {0, 10, 0}};&#xD;
    angles = {{0, 270, 0}, {0, 180, 0}, {0, 90, 0}, {0, 0, 0}, {90, 0, 0}};&#xD;
    camera = CreateUnityCamera[&amp;#034;Test Camera&amp;#034;];&#xD;
    go = camera[[&amp;#034;GameObject&amp;#034;]];&#xD;
    views = Table[go[[&amp;#034;Position&amp;#034;]] = positions[[i]];&#xD;
       go[[&amp;#034;EulerAngles&amp;#034;]] = angles[[i]];&#xD;
       UnityCameraImage[camera, ImageResolution -&amp;gt; 800, &#xD;
        ImageSize -&amp;gt; 300], {i, Length[positions]}];&#xD;
    DeleteUnityGameObject[go]&#xD;
    ListAnimate[views]&#xD;
&#xD;
[![Baked spheres][5]][6]&#xD;
&#xD;
I installed the free version of Lux from the Asset Store, but it appears that a lot of the functionality could not be installed due to licensing claims.  I have not experimented with any other shaders, so I have no opinion on them.&#xD;
&#xD;
## Previous Rigid Body Simulation in Blender Answer&#xD;
I answered a question about flipping a coin [here](https://community.wolfram.com/groups/-/m/t/1803725) where I interfaced _Mathematica_ to _Blender_.  _Blender_ has a Principled shader in its distribution that has _IOR_ and other settings that should allow you to create a more realistic scene without needing to find a commercial option.  &#xD;
&#xD;
[![Blender Principled Shader Interface][7]][8]&#xD;
&#xD;
# Update #2: Blender 2.79b Implementation&#xD;
I am very new to the Unity universe and could use a good recommendation for principled shaders. In lieu of finding a good Unity shader, I threw together a simple implementation in Blender to utilize its shader from _Mathematica_.  To use this implementation, you will need to install Blender 2.79b and make sure that it is in your path.&#xD;
&#xD;
## Use [`StringTemplate`](https://reference.wolfram.com/language/ref/StringTemplate.html) to Create a Blender Python Script&#xD;
Blender has a fairly complete Python API, and there are many examples you can grab from the web.  I created a module that generates a Blender Python script which uses the photorealistic Cycles renderer to display a _Mathematica_ object that has been saved as STL.&#xD;
&#xD;
I used [`Compress`](https://reference.wolfram.com/language/ref/Compress.html) to preserve the whitespace-sensitive formatting of the Python code, which tends to get garbled here when copying Python code inside _Mathematica_ code.  &#xD;
&#xD;
I also created two functions to render either sharp objects (like a crystal) or smooth objects (like a heart).  The following will produce Blender code and execute it in the background.  Depending on your hardware, you may need to adjust the render settings.  I am using Windows, but I think it should work on other platforms.  Because _Mathematica_ and _Blender_ need to share files, I thought the most straightforward approach was to use the notebook directory; this implies that the following code needs to be executed from a previously saved notebook.&#xD;
&#xD;
    blenderworkflow[mmaobj_, frames_: 1] := &#xD;
     Module[{pre, imgset, nbd, glassRenderScript, file, fileName, &#xD;
       outputfile, stext, files, imgs},&#xD;
      nbd = NotebookDirectory[];&#xD;
      pre = StringTemplate[&#xD;
         Uncompress[&#xD;
          &amp;#034;1:eJydVl1v2zYU7Wv/\&#xD;
    Bes9yAYKTnKbrCvghyZNtgFLE8TFiiI1XEqiLC4UaZBUUv/B/a7eS+qztZO1RhCJvPee+\&#xD;
    3GOTT5L9fXyv6dPnohqq40j6Xb3KU7mhdEV2TJXSpGSxnQFy85WwaI1GJYLpizavF0bool\&#xD;
    QiEVz5hjV6b88c/Y1Wgl8REE0dbstJ4sFiS7Oln9GBIP6vbOLq/\&#xD;
    cfoy4AP5paLgGGLMh7U/PWxKXlh/\&#xD;
    zOGRjbsn4hb2HfccKkJE1FWKUrObEZV94RS9Zb21RMcx8xnfUYSw79cpVzQ7jaCMWJ0yTb\&#xD;
    ZZJbfKstJ5U2HHyYFNaJDCfFjYBKWvxMK8e/OOqT0gBGGzBo/\&#xD;
    fTj6d9ny6hPd8Ky243RtcrJqZbaoOVeG5mDdzdiv2Fvog/4jFadD4WK1krnUF4/\&#xD;
    t4CdlQyTbnACUPYdkzV45Rr+saLACeJomvp88RvACKiIuHaGc/8GefsiQ/\&#xD;
    J0Q4Xa1s7exCsYY8Fq6dY+xc3rFyvAmcY0jo+PX72c//acxDRJ4uP4d3ybz4+\&#xD;
    PXr2YjUGSb0AAIKHxkNoCx3caiEiZ5fl48OAiigLpMTzPhS1JITVoblkyaO8dNHFi86Lxw\&#xD;
    QBwO0ePd6zCbJPrZj0ZGocMdOmo4vfTYfhsGLKXEKNQrp1LN93GptCoXJh1sxcWPhd+\&#xD;
    axbRuJN/uLzjLvKp71LYCBDqJgoWcrJ8ex6oghF3xoumCXJZOxh8tGoJiJa1KVjGQ4gGK8\&#xD;
    R4YKq9K2qgg8T6pFC3oT6AeE7Qq6+mg/WKjr5nF+QxP0I5xHHzAJkkgy/ih1I4/odk1g/\&#xD;
    kvlu1fPX2ydjhMGdjkNk4bC9v9xtP3MBrRB2akTt89uT53YPEeZjAWyDGu/\&#xD;
    8oM01QaDjwso8nX9ghogLGpMeYrB6nLQGWwt+jGNe63pSKQ7ff4wDhR4/F/3V5vScyoS+\&#xD;
    PepW8IbYUakcq7pjELf8y0sgF7vQa8cvDGuniZ+OI/\&#xD;
    fKoUB4Dp7E6qkYd1Ugd1UPquDJCZWIreT6SSPUzEglBPeJDOqke0skhoDa7712KbB9b/\&#xD;
    wvgYaUk49M5FwbOLm12eB4r7Xiq9W2/G24oEg/whb/STKPPKgXz52jw6/Imz/\&#xD;
    3p5w8JH6IazaT6C7UuCKa9LYR70LritkTbtBCS4+VpYZ2ZNsl+\&#xD;
    JR5i5pOwzIk7fukvGY3W2ltBMK3DBYSQb71be6vKwbGBrl8BJ79PMw==&amp;#034;]][&amp;lt;|&#xD;
         &amp;#034;nbdir&amp;#034; -&amp;gt; StringReplace[nbd, &amp;#034;\\&amp;#034; -&amp;gt; &amp;#034;\\\\&amp;#034;]|&amp;gt;];&#xD;
      imgset = &#xD;
       StringTemplate[&#xD;
         Uncompress[&#xD;
          &amp;#034;1:eJyFVNtuGjEQzVv/\&#xD;
    onK3DywVcoAEmkbiIYqStFKSotJUlSjamPUArhbvyjYhqOq/\&#xD;
    d8beJUtoUwTClzNnzlw8b6b5l9HrVwcHb9kIHLsWy4INTV6AcQrsj3anm9HRgE2LDZfCCU\&#xD;
    57O24QsjGpANxtCkBUNLq7jbaHKwuJziVYvPlqVuAvtNO49fd0lzgD4Fe2uh43LpbKWpXr\&#xD;
    xoQrXawc+hshTs/\&#xD;
    dAo8kzMQqc8mDyFbktNvzlvn0Z10nbiFFy4iURpMKwrM8FQ65ERsftzqtfnN7ZXLnrxJYZ\&#xD;
    WAI0Ob9XqvN271et/O+1eEn/\&#xD;
    f7Rh463oF9IGgqTCLfgnNJzHwapSHPt4NFxm4IGnm7SDCxqf1ApiW5cDe8af4cGPu5UBsk\&#xD;
    jQjvdk/8DNy8BS+cWM5H5avSC/\&#xD;
    n8iqXJCq6VwIBMLIGsVDIGfSckk6FxZjPlFeZnYgLHj9qROvjXdI/\&#xD;
    6mYM3Osox9DgWs8pzhhpWlZ6lYghF0NRMpRc47tMHjegsEFLbAuV+\&#xD;
    EJrBOGIfBzTMqA0K4X5fmvLqowQ7RR73iXknZXswthGNrhXqnUHYCSC8sNwRiSrP91JTWp\&#xD;
    wRk+KHus4F4wC5FZuEFigdM3BRL/\&#xD;
    pxEzZjOHYuJbKEksGDu10mQ1tyC97zWq0De8sKiJ1gfyTKPicsTOkiCCci4uZO0nZw95et\&#xD;
    SacmWYBfhgYcV+\&#xD;
    htTZM8irD9cCocE0mQZDKKbi9HHUMBAe24Am5MJT83mJl8VdeWBhMOycJtESBl7msbw+\&#xD;
    uzTbXL2/WLUaLFqEgzidovRt+kDqvzvz5JK/9zQRNwOmAtyErSVt1xjxmgc3uD+\&#xD;
    irRFT8qHAmvhypRg67jFsxhqSQmgnTYpgvmAlb5CT0ORpPnKn9/\&#xD;
    PDHq395VHoiMA8RksDlC4TxZPLVFNvSqI3XG4gxp3Jwg0QiqhbezZ37G4c4Kjkh3Wybej0\&#xD;
    jfokp57hCNf2EXy67R9LH/zQs8jjhpx1niiZoUujNIunuUZ9u0hmu5QbUvj3xMWYUT/\&#xD;
    0aQaOjOciYXA1FJXmucsFQN1SmkQ/mK2NspBYnGmZgN6Eswb/AFsMV/A&amp;#034;]][&amp;lt;|&#xD;
         &amp;#034;frames&amp;#034; -&amp;gt; frames|&amp;gt;];&#xD;
      DeleteFile@FileNames[&amp;#034;_trash_*.png&amp;#034;];&#xD;
      glassRenderScript = pre &amp;lt;&amp;gt; mmaobj &amp;lt;&amp;gt; imgset;&#xD;
      fileName = &amp;#034;glassrender.py&amp;#034;;&#xD;
      file = OpenWrite[fileName];&#xD;
      WriteString[file, glassRenderScript];&#xD;
      Close[file];&#xD;
      outputfile = CreateFile[];&#xD;
      Run[&amp;#034;blender --background --python glassrender.py &amp;gt;&amp;gt;&amp;#034; &amp;lt;&amp;gt; outputfile];&#xD;
      stext = OpenRead[outputfile];&#xD;
      Close[stext];&#xD;
      DeleteFile[outputfile];&#xD;
      files = FileNames[&amp;#034;_trash_*.png&amp;#034;, nbd];&#xD;
      imgs = Import[#] &amp;amp; /@ files;&#xD;
      imgs&#xD;
      ]&#xD;
    (* Bounding Box Related Info *)&#xD;
    bb[r_] := Module[{temp, c, min, max, ext},&#xD;
      temp = Transpose@RegionBounds@r;&#xD;
      c = Mean@temp;&#xD;
      min = temp[[1]];&#xD;
      max = temp[[2]];&#xD;
      ext = First@Differences@temp;&#xD;
      {c, min, max, ext}]&#xD;
    (* Use this function for MMA objects with sharp edges *)&#xD;
    impMMAobj[s_, mat_: &amp;#034;whiteGlass&amp;#034;] := &#xD;
     StringTemplate[&#xD;
       Uncompress[&#xD;
        &amp;#034;1:eJxdjk0OgjAQhXuUETewqT97DuDKBLcm0NYhlLRM006M3tBjSSG4cPcy3/\&#xD;
    teZqepuX2EEHu4+ECRgQeEjvQ4KY/d/\&#xD;
    Xg69zlBDcXvKhO7IiMd3pJCknZRW49pyKzsrcOgeKgTx7In98AIB1iGqiqLyrB94lWPaHi\&#xD;
    ezjuGJsYXyxW1tLL/8oa9YoxWuVnutrx8+wWRckmI&amp;#034;]][&amp;lt;|&amp;#034;objname&amp;#034; -&amp;gt; s, &#xD;
       &amp;#034;material&amp;#034; -&amp;gt; mat|&amp;gt;]&#xD;
    (* Use this function for additional smoothing applied in Blender *)&#xD;
    impMMAobjsmooth[s_, mat_: &amp;#034;whiteGlass&amp;#034;] := &#xD;
     StringTemplate[&#xD;
       Uncompress[&#xD;
        &amp;#034;1:eJxdjz0OwjAMRnsUU5Z2CT97D8CEBCtSmrauEpTUUWIhODK3oElVhNgsv+\&#xD;
    99ljcdXa7voii2cHKeAgNrhJa6+\&#xD;
    6Qctrf94TimCRoov1sR2ZYJdf4lyEdhsiodRp1YNRqLXrFuIodqJDtggB3korr+\&#xD;
    FedG7FlErQaU0RGxrnJA9WweeM54vp3yPU2MTxYLkosK/+\&#xD;
    EVO8UYjLKz3K5zfucD6o9VhQ==&amp;#034;]][&amp;lt;|&amp;#034;objname&amp;#034; -&amp;gt; s, &amp;#034;material&amp;#034; -&amp;gt; mat|&amp;gt;]&#xD;
    (* Directory Info *)&#xD;
    nbdir = NotebookDirectory[];&#xD;
    SetDirectory@nbdir;&#xD;
&#xD;
In the documentation, I found an implicit function that defines a nice heart-shaped region.  The following code will discretize the region, create an appropriately sized stage, assign a white glass material to the object, and render the image in Blender.&#xD;
&#xD;
    (* Create Heart Shaped Region *)&#xD;
    drheart = &#xD;
      DiscretizeRegion[&#xD;
       ImplicitRegion[(x^2 + (9/4) y^2 + z^2 - 1)^3 - &#xD;
          x^2 z^3 - (9/80) y^2 z^3 == 0, {x, y, z}], &#xD;
       MaxCellMeasure -&amp;gt; 0.00005];&#xD;
    {c, min, max, ext} = bb[drheart];&#xD;
    (* Create a Stage *)&#xD;
    box = Cuboid[{-Max[ext], -Max[ext], &#xD;
        min[[3]] - ext[[3]]/10}, {Max[ext], Max[ext], min[[3]]}];&#xD;
    RegionPlot3D[{drheart, box}]&#xD;
    (* Export MMA objects as STL *)&#xD;
    Export[&amp;#034;heart.stl&amp;#034;, drheart];&#xD;
    Export[&amp;#034;box.stl&amp;#034;, box];&#xD;
    (* Render in Blender in the background *)&#xD;
    (* Default material is white glass *)&#xD;
    imgs = blenderworkflow[impMMAobjsmooth[&amp;#034;heart&amp;#034;]];&#xD;
    First@imgs&#xD;
&#xD;
[![Glass Heart][9]][10]&#xD;
&#xD;
The following changes the material to a shiny white metal.&#xD;
&#xD;
    (* Now render in white shiny metal *)&#xD;
    imgs = blenderworkflow[impMMAobjsmooth[&amp;#034;heart&amp;#034;, &amp;#034;whiteMetal&amp;#034;]];&#xD;
    First@imgs&#xD;
&#xD;
[![Steel heart][11]][12]&#xD;
&#xD;
You can also create a rotating-stage animation by setting the frames parameter to something other than 1, but this can take a while to render.&#xD;
&#xD;
    (* Rotate the stage *)&#xD;
    (* Will take a long time *)&#xD;
    (* Default material is glass *)&#xD;
    imgs = blenderworkflow[impMMAobjsmooth[&amp;#034;heart&amp;#034;], 60];&#xD;
    First@imgs&#xD;
    ListAnimate[imgs]&#xD;
&#xD;
[![Rotating stage][13]][14]&#xD;
&#xD;
Finally, for objects with sharp features, one should not use smoothing, as it will round the corners.  Here is an example with Spikey.&#xD;
&#xD;
    (* Create a glass spikey *)&#xD;
    poly = PolyhedronData[&amp;#034;Spikey&amp;#034;, &amp;#034;BoundaryMeshRegion&amp;#034;];&#xD;
    {c, min, max, ext} = bb[poly];&#xD;
    box = Cuboid[{-Max[ext], -Max[ext], &#xD;
        min[[3]] - ext[[3]]/10}, {Max[ext], Max[ext], min[[3]]}];&#xD;
    Export[&amp;#034;spikey.stl&amp;#034;, poly];&#xD;
    Export[&amp;#034;box.stl&amp;#034;, box];&#xD;
    RegionPlot3D[{poly, box}]&#xD;
    imgs = blenderworkflow[impMMAobj[&amp;#034;spikey&amp;#034;]];&#xD;
    First@imgs&#xD;
&#xD;
[![Glass Spikey][15]][16]&#xD;
&#xD;
&#xD;
----------&#xD;
&#xD;
*The original version of this post can be found [HERE][17].*&#xD;
&#xD;
&#xD;
  [1]: https://i.stack.imgur.com/6P2TD.png&#xD;
  [2]: https://i.stack.imgur.com/6P2TD.png&#xD;
  [3]: https://i.stack.imgur.com/pmC3X.png&#xD;
  [4]: https://i.stack.imgur.com/pmC3X.png&#xD;
  [5]: https://i.stack.imgur.com/TMySF.png&#xD;
  [6]: https://i.stack.imgur.com/TMySF.png&#xD;
  [7]: https://i.stack.imgur.com/rVKI3.png&#xD;
  [8]: https://i.stack.imgur.com/rVKI3.png&#xD;
  [9]: https://i.stack.imgur.com/znkw0.png&#xD;
  [10]: https://i.stack.imgur.com/znkw0.png&#xD;
  [11]: https://i.stack.imgur.com/5Zb3O.png&#xD;
  [12]: https://i.stack.imgur.com/5Zb3O.png&#xD;
  [13]: https://i.stack.imgur.com/MST7J.gif&#xD;
  [14]: https://i.stack.imgur.com/MST7J.gif&#xD;
  [15]: https://i.stack.imgur.com/SpCsy.png&#xD;
  [16]: https://i.stack.imgur.com/SpCsy.png&#xD;
  [17]: https://mathematica.stackexchange.com/a/202180</description>
    <dc:creator>Tim Laska</dc:creator>
    <dc:date>2019-10-04T15:22:32Z</dc:date>
  </item>
  <item rdf:about="https://community.wolfram.com/groups/-/m/t/1383237">
    <title>[WSC18] Simulating Auditory and Visual deficiencies</title>
    <link>https://community.wolfram.com/groups/-/m/t/1383237</link>
<description>#What is this project?&#xD;
This project is intended to let you experience the world as other people see and hear it. In particular, I explore various auditory and visual impairments and simulate their effects. I hope this tool can go on to help friends and families understand what their loved ones are going through.&#xD;
Overall, I set out to create an interactive display of different sensory deficiencies, and I believe I achieved my goal in part. Although I would like to add more, I covered some of the most common and impactful ailments. I used two main sources to acquire my data:&#xD;
http://www.roger-russell.com/hearing/hearing.htm&#xD;
https://www.sciencedirect.com/science/article/pii/S167229301150008X?via%3Dihub#bb0005&#xD;
&#xD;
#What does it look like?&#xD;
Below are instances of the simulation, as well as rough regressions I made to supplement the data:&#xD;
![Audio Distortion][1]![Visual Distortion][2]&#xD;
![Female Regression][3]![Male Regression][4]&#xD;
#How did I Create this project?&#xD;
In order to simulate frequency loss, I used Fourier transforms to isolate ranges of frequencies and lower their decibel level by a certain percentage. To add tinnitus, I had to play around with different functions in the audio space, but eventually I was able to simplify it and apply it. The bulk of my code is below; it is the code which implements the Fourier transform.&#xD;
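&#xD;
First, though, the tinnitus overlay itself can be sketched roughly as follows. This is a hedged reconstruction, not the project&amp;#039;s exact code; the 8000 Hz tone and the 0.05 gain are illustrative assumptions:&#xD;
```&#xD;
(* Mix a faint high-frequency tone into the clip to mimic tinnitus *)&#xD;
AddTinnitus[audio_] :=&#xD;
 AudioOverlay[{audio,&#xD;
   AudioAmplify[AudioGenerator[{&amp;#034;Sin&amp;#034;, 8000}, Duration[audio]],&#xD;
    0.05]}]&#xD;
```&#xD;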
```&#xD;
AgeAlter[audio_, age_, gender_] := &#xD;
 Module[{sampleRate, fromDezibel, amps, kernel, Data, e},&#xD;
  If[gender == &amp;#034;Male&amp;#034;, Data = ApproximateAgeMale[age], &#xD;
   Data = ApproximateAgeFemale[age]];&#xD;
  sampleRate = First@Values@Options[audio, SampleRate];&#xD;
  fromDezibel = x \[Function] Exp[-x/8.685889638065035`];&#xD;
  amps = MapAt[N@BlockMap[Mean, \[Pi] #/sampleRate, 2, 1] &amp;amp;, &#xD;
    MapAt[fromDezibel, Transpose[Data], {2}], {1}];&#xD;
  kernel = &#xD;
   LeastSquaresFilterKernel[amps, &#xD;
    First@Values@Options[audio, SampleRate]];&#xD;
  e = AudioLoudness[&#xD;
    Audio[Map[channel \[Function] ListConvolve[kernel, channel], &#xD;
      AudioData@audio], Sequence @@ Options[audio]]];&#xD;
  If[age &amp;lt; 0, &amp;#034;How Are you Alive???&amp;#034;, &#xD;
   If[age &amp;gt; 90, &#xD;
    &amp;#034;You&amp;#039;re either Dead or Deaf (Or you&amp;#039;re hearing is far too \&#xD;
negligible to simulate)&amp;#034;,&#xD;
    If[age &amp;lt;= 30, &#xD;
     Grid[{{ListLinePlot[AudioLoudness[audio], &#xD;
         PlotLabels -&amp;gt; &amp;#034;What you hear/would hear&amp;#034;, &#xD;
         PlotLabel -&amp;gt; &#xD;
          &amp;#034;Decibel level of orignal and modified audio clips&amp;#034;, &#xD;
         LabelStyle -&amp;gt; {Black, Bold}, &#xD;
         AxesLabel -&amp;gt; {&amp;#034;Seconds&amp;#034;, &amp;#034;Decibels&amp;#034;}, &#xD;
         ImageSize -&amp;gt; Large]}, {audio}}],&#xD;
     Grid[{{ListLinePlot[{e, &#xD;
          AudioLoudness[&#xD;
           AudioTrim[audio, (-1*(Duration[audio] * .88094189354))], &#xD;
           Alignment -&amp;gt; Center]}, &#xD;
         PlotLabels -&amp;gt; {&amp;#034;What you hear&amp;#034;, &amp;#034;What you would hear&amp;#034;},&#xD;
         PlotLabel -&amp;gt; &#xD;
          &amp;#034;Decibel level of original and modified audio clips&amp;#034;,&#xD;
         LabelStyle -&amp;gt;  {Black, Bold},&#xD;
         AxesLabel -&amp;gt; {&amp;#034;Seconds&amp;#034;, &amp;#034;Decibels&amp;#034;},&#xD;
         ImageSize -&amp;gt; Large]},&#xD;
   {Audio[&#xD;
         Map[channel \[Function] ListConvolve[kernel, channel], &#xD;
          AudioData@audio], Sequence @@ Options[audio]]}}]]]]]&#xD;
```&#xD;
&#xD;
This code essentially takes in an audio file and a corresponding age; it then transforms the audio into Fourier space and maps decibel-lowering amounts across different frequency ranges. Since decibels use a logarithmic scale, I often struggled dealing with different units and converting between them, although a professor helped me work it out. Other than this, the rest was not too difficult to implement; the visual aspects were just a simple blurring of the image and extraction of different RGB values (using built-in functions).&#xD;
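&#xD;
The visual side can be sketched along these lines. This is a hedged illustration rather than the project&amp;#039;s code; the blur radius and channel weight are made-up values:&#xD;
```&#xD;
simulateVision[img_Image] := Module[{blurred, r, g, b},&#xD;
  blurred = Blur[img, 4]; (* blur to mimic reduced acuity *)&#xD;
  {r, g, b} = ColorSeparate[blurred]; (* extract RGB channels *)&#xD;
  ColorCombine[{r, g, ImageMultiply[b, 0.6]}] (* attenuate one channel *)&#xD;
  ]&#xD;
```&#xD;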
&#xD;
#In the future&#xD;
In the future, I intend to improve and grow this project, expanding its reach to many different deficiencies, both auditory and visual and beyond. I also hope to use this project as a demonstration to help raise awareness for people afflicted with these ailments, or, if it becomes solidified enough, to directly make an impact on those who need it.&#xD;
&#xD;
&#xD;
  [1]: http://community.wolfram.com//c/portal/getImageAttachment?filename=4967AudioModel.png&amp;amp;userId=1372232&#xD;
  [2]: http://community.wolfram.com//c/portal/getImageAttachment?filename=10323VIsualModel.png&amp;amp;userId=1372232&#xD;
  [3]: http://community.wolfram.com//c/portal/getImageAttachment?filename=Female2.png&amp;amp;userId=1372232&#xD;
  [4]: http://community.wolfram.com//c/portal/getImageAttachment?filename=Male.png&amp;amp;userId=1372232</description>
    <dc:creator>Sartaj Gulati</dc:creator>
    <dc:date>2018-07-13T20:53:52Z</dc:date>
  </item>
  <item rdf:about="https://community.wolfram.com/groups/-/m/t/1379131">
    <title>[WSS18] Augmented &amp;amp; Virtual Reality implementation in Wolfram Language</title>
    <link>https://community.wolfram.com/groups/-/m/t/1379131</link>
    <description># Augmented and Virtual reality functions in Mathematica #&#xD;
&#xD;
## Overture ##&#xD;
&#xD;
The goal of my project at Wolfram Summer School &amp;#039;18 was to implement VR and AR visualization features in the Wolfram Language. Conceptually, the underlying idea was to create two simple functions through which to deploy, in a VR or AR environment, any 3D object created in Mathematica, from a random solid (e.g. created using the [ConvexHullMesh][1] function):&#xD;
&#xD;
    ConvexHullMesh[RandomReal[5, {50, 3}]]&#xD;
&#xD;
to a detailed view of 3D elevation data for a specific region:&#xD;
&#xD;
    ListPlot3D[GeoElevationData[GeoDisk[Entity[&amp;#034;City&amp;#034;, {&amp;#034;Tivoli&amp;#034;, &amp;#034;Lazio&amp;#034;, &amp;#034;Italy&amp;#034;}], 4 Quantity[1, &amp;#034;Kilometers&amp;#034;]], GeoZoomLevel -&amp;gt; 12]]&#xD;
&#xD;
## How ##&#xD;
The two functions ARDeploy and VRDeploy work in conjunction with the WebVR Boilerplate platform and the AR.js library, which provide the required JavaScript tools to manage the VR/AR environments. The object representation is translated from Mathematica into a JS-compatible format by the Three.js and A-Frame libraries.&#xD;
In particular, VRDeploy makes it possible to investigate abstract mathematical representations (and any 3D Mathematica models) in a new environment, leading to a clearer and easier understanding of them. In combination with a mobile device equipped with a headset, it gives the user a full immersion experience.&#xD;
ARDeploy, on the other hand, enables &amp;#034;on-screen&amp;#034; exploration of the same objects compatible with VRDeploy, attaching them to a marker which can be visualized in the Mathematica notebook.&#xD;
The main difference between the two functions lies in the way the 3D object is explored: VRDeploy offers a static, fully immersive point of view, while ARDeploy offers a dynamic exploration.&#xD;
&#xD;
These two functions (in form of Mathematica package) and all the required files can be downloaded at [https://github.com/DomenicoRomano/ARVRDeploy][2].&#xD;
&#xD;
## VRDeploy Example Code ##&#xD;
VRDeploy (like ARDeploy) works with any 3D surface from Plot3D or ExampleData. In the example code below, the 3D surface from [Plot3D][3] is translated into a VR object; VRDeploy returns a link to a webpage containing the WebVR environment with the Plot3D object visualized:&#xD;
&#xD;
    VRDeploy[Plot3D[0.5 *Sin[0.4*x y], {x, -4, 4}, {y, -4, 4}, MaxRecursion -&amp;gt; 4]]&#xD;
&#xD;
Below is a screenshot of the webpage returned by VRDeploy as seen from a mobile device; this provides the full immersion experience. It is also possible to switch to normal view and then navigate the VR scene using the mobile device&amp;#039;s orientation hardware.&#xD;
&#xD;
![Mobile device view of the full immersive experience][4]&#xD;
&#xD;
Below is a view of the same webpage returned by VRDeploy as seen from a laptop/desktop computer:&#xD;
&#xD;
![Web page visualizing the output of the above VRDeploy line of code][5]&#xD;
&#xD;
Another example is the VR visualization of the distribution of stars around the Earth, using the [StarData][6] function. The following code extracts the positions of the 35000 stars closest to our Sun, converts the distances to parsecs, and deploys them with VRDeploy:&#xD;
&#xD;
    VRDeploy[Graphics3D[&#xD;
      Sphere[QuantityMagnitude[&#xD;
        DeleteMissing[StarData[EntityClass[&amp;#034;Star&amp;#034;, &#xD;
         List[Rule[EntityProperty[&amp;#034;Star&amp;#034;, &amp;#034;DistanceFromEarth&amp;#034;], &#xD;
           TakeSmallest[35000]]]], &amp;#034;HelioCoordinates&amp;#034;]]/3.26],&#xD;
       .05], {&#xD;
      Axes -&amp;gt; {True, True, True}, &#xD;
       AxesLabel -&amp;gt; {&amp;#034;pc&amp;#034;, &amp;#034;pc&amp;#034;, &amp;#034;pc&amp;#034;}, ImageSize -&amp;gt; Large}]]&#xD;
&#xD;
Resulting in the picture below:&#xD;
&#xD;
![A full immersive view of the closest 35000 stars as seen from Earth, created with VRDeploy and the StarData function.][7]&#xD;
&#xD;
---&#xD;
##ARDeploy example##&#xD;
A simple example of the ARDeploy function presented here uses the Viking Lander model from the 3D example data of Wolfram Mathematica; the code follows:&#xD;
&#xD;
    ARDeploy[ExampleData[{&amp;#034;Geometry3D&amp;#034;, &amp;#034;VikingLander&amp;#034;}]]&#xD;
&#xD;
The returned CloudObject is a link to a webpage, which can be easily converted to a QR code and scanned by mobile devices.&#xD;
&#xD;
    BarcodeImage[&#xD;
     First[ARDeploy[ExampleData[{&amp;#034;Geometry3D&amp;#034;, &amp;#034;VikingLander&amp;#034;}]]], &amp;#034;QR&amp;#034;]&#xD;
&#xD;
The webpage opened on the mobile device will ask &amp;#034;to allow the application to access the camera&amp;#034;. After granting access to the camera, the mobile device will recognize the marker and plot the ARDeployed object on it:&#xD;
&#xD;
![The Viking Lander 3D model, taken from Mathematica Example data, visualized attached to a marker on a laptop screen.][8]&#xD;
&#xD;
## Future Directions ##&#xD;
&#xD;
The next steps are to enrich the visualization experience with interactive capabilities, displaying specific information or changing the object properties.&#xD;
Another required step is marker-independent placement of the object using absolute GPS coordinates; the required mobile hardware improvements will soon be available.&#xD;
It will also be interesting to offer a virtual reality Mathematica notebook in which the user could experience the advantages of learning in a fully immersive three-dimensional virtual world.&#xD;
&#xD;
&#xD;
  [1]: http://reference.wolfram.com/language/ref/ConvexHullMesh.html&#xD;
  [2]: https://github.com/DomenicoRomano/ARVRDeploy&#xD;
  [3]: http://reference.wolfram.com/language/ref/Plot3D.html?q=Plot3D&#xD;
  [4]: http://community.wolfram.com//c/portal/getImageAttachment?filename=IMG_0020.PNG&amp;amp;userId=1362723&#xD;
  [5]: http://community.wolfram.com//c/portal/getImageAttachment?filename=ScreenShot2018-07-10at21.47.34.png&amp;amp;userId=1362723&#xD;
  [6]: http://reference.wolfram.com/language/ref/StarData.html?q=StarData&#xD;
  [7]: http://community.wolfram.com//c/portal/getImageAttachment?filename=ScreenShot2018-07-10at15.17.01.png&amp;amp;userId=1362723&#xD;
  [8]: http://community.wolfram.com//c/portal/getImageAttachment?filename=IMG_0012.jpg&amp;amp;userId=1362723</description>
    <dc:creator>Romano Domenico</dc:creator>
    <dc:date>2018-07-11T18:57:54Z</dc:date>
  </item>
  <item rdf:about="https://community.wolfram.com/groups/-/m/t/1369571">
    <title>How does a neural network that only knows beauty interpret the world?</title>
    <link>https://community.wolfram.com/groups/-/m/t/1369571</link>
    <description>I recently came across a video that intended to show how neural networks interpret images of (not so beautiful) things if they have only been trained on beautiful things. It is quite a nice question, I think. [Here is a website describing the technique][1], and [here is a video that illustrates the idea][2]. In this post I will show you, how to generate similar effects easily with the Wolfram Language:&#xD;
&#xD;
![enter image description here][3]&#xD;
&#xD;
and in video format:&#xD;
&#xD;
![enter image description here][4]&#xD;
&#xD;
On the right you see the &amp;#034;interpretation&amp;#034; of a neural network that has been shown lots of photos of flowers, when it actually looks at a rubbish dump with a couple of birds sitting on the rubbish. &#xD;
&#xD;
Devising a plan&#xD;
----------&#xD;
&#xD;
We will need a training dataset and should hope to find a network in the [Wolfram Neural Net Repository][5] that more or less does what we want to do. If you have watched some of the [excellent training videos on neural nets][6] offered by Wolfram you will have noticed that the general suggestion is not to develop your own neural networks from scratch, but rather use what is already there and perhaps combine it, or adapt it so that you can achieve what you want. This is also very well described in this recent [blog-post by experts on the topic][7]. I am usually happy if I can use the work of others and do not have to re-invent the wheel. &#xD;
&#xD;
If you read the posts describing how to build a network that has only seen beautiful things, you will find that they used a variation of the pix2pix network and an implementation in TensorFlow (a &amp;#034;conditional adversarial network&amp;#034;).  If you go through the extensive list of networks offered in the Wolfram Neural Net Repository, you will see that there are pix2pix resources, e.g. &#xD;
&#xD;
    ResourceObject[&amp;#034;Pix2pix Photo-To-Street-Map Translation&amp;#034;]&#xD;
&#xD;
or &#xD;
&#xD;
    net=ResourceObject[&amp;#034;Pix2pix Street-Map-To-Photo Translation&amp;#034;]&#xD;
&#xD;
I will use the latter resource object, but that does not actually matter. Next, we will need to build a training set. &#xD;
&#xD;
Scraping data for the training set&#xD;
----------&#xD;
&#xD;
The next thing we need is a solid training set. My first attempt was to use ServiceConnect with the google search to obtain lots of images of flowers. &#xD;
&#xD;
    googleCS = ServiceConnect[&amp;#034;GoogleCustomSearch&amp;#034;]&#xD;
    imgs = ServiceExecute[&amp;#034;GoogleCustomSearch&amp;#034;, &#xD;
       &amp;#034;Search&amp;#034;, {&amp;#034;Query&amp;#034; -&amp;gt; &amp;#034;Flowers&amp;#034;, MaxItems -&amp;gt; 1000, &#xD;
        &amp;#034;SearchType&amp;#034; -&amp;gt; &amp;#034;Image&amp;#034;}];&#xD;
&#xD;
It turns out that the maximum number of results returned is only 100, which is not enough for our purpose. I tried to fix this by using&#xD;
&#xD;
    imgs2 = ServiceExecute[&amp;#034;GoogleCustomSearch&amp;#034;, &#xD;
       &amp;#034;Search&amp;#034;, {&amp;#034;Query&amp;#034; -&amp;gt; &amp;#034;Flowers&amp;#034;, MaxItems -&amp;gt; 1000, &#xD;
        &amp;#034;StartIndex&amp;#034; -&amp;gt; 101, &amp;#034;SearchType&amp;#034; -&amp;gt; &amp;#034;Image&amp;#034;}];&#xD;
&#xD;
but that did not work. So WebImageSearch is the way to go. It does cost ServiceCredits, but the costs are relatively limited. Let&amp;#039;s download information on 1000 images of flowers:&#xD;
&#xD;
    imgswebsearch = WebImageSearch[&amp;#034;Flowers&amp;#034;, MaxItems -&amp;gt; 1000];&#xD;
    Export[&amp;#034;~/Desktop/imglinks.mx&amp;#034;, imgswebsearch]&#xD;
&#xD;
A WebImageSearch of up to 10 results costs 3 ServiceCredits, so 1000 images should cost 300 credits. 500 credits can be bought for $3, and 5000 for $25 (+VAT). This means the generation of our training set comes in at $1.80 at most, which is manageable, particularly if we consider the price of the eGPU that we will use later on. Just in case, we export the result, because we paid for it and might have to recover it if we suffer a kernel crash or something. &#xD;
&#xD;
Alright. Now we have a dataset that looks more or less like this:&#xD;
&#xD;
![enter image description here][8]&#xD;
&#xD;
Great. That contains the &amp;#034;ImageHyperlink&amp;#034; which we will now use to download all the images:&#xD;
&#xD;
    rawimgs = Import /@ (&amp;#034;ImageHyperlink&amp;#034; /. Normal[imgswebsearch]);&#xD;
    Export[&amp;#034;~/Desktop/rawimgs.mx&amp;#034;, rawimgs]&#xD;
&#xD;
Again, we export the result (better safe than sorry!). Let&amp;#039;s make the images conform:&#xD;
&#xD;
    imagesconform = ConformImages[Select[rawimgs, ImageQ]];&#xD;
&#xD;
By using Select[..., ImageQ] we make sure that we use only images, and not error messages from the cases where the import didn&amp;#039;t work. &#xD;
&#xD;
Generating a training set&#xD;
----------&#xD;
&#xD;
&#xD;
In the original posts they suggest that they used edges, i.e. EdgeDetect, to generate partial information from the images, and then linked that to the full image like so:&#xD;
&#xD;
    rules = ImageAdjust[EdgeDetect[ImageAdjust[#]]] -&amp;gt; # &amp;amp; /@ imagesconform;&#xD;
&#xD;
It turns out that my results with that were less than impressive, so I went for a more time-consuming approach that gave better results. I used &#xD;
&#xD;
    Monitor[rulesnew = Table[Colorize[ClusteringComponents[rules[[i, 2]], 7]] -&amp;gt; rules[[i, 2]], {i, 1, Length[rules]}], i]&#xD;
&#xD;
i.e. ClusteringComponents to generate a training set. Partial information on the images now looked like this:&#xD;
&#xD;
![enter image description here][9]&#xD;
&#xD;
rather than &#xD;
&#xD;
![enter image description here][10]&#xD;
&#xD;
when we use EdgeDetect. Our training data set now links the image with partial (ClusteringComponents) information via a rule to the original image. Basically, we give the network partial information about the world and train it to see flowers. Just in case, we export the data set like so:&#xD;
&#xD;
    Export[&amp;#034;~/Desktop/rulesnew.mx&amp;#034;, rulesnew]&#xD;
&#xD;
Training the network&#xD;
----------&#xD;
&#xD;
If you want to train on the EdgeDetect version you can use:&#xD;
&#xD;
    retrainednet = NetTrain[net, rules, TargetDevice -&amp;gt; &amp;#034;GPU&amp;#034;, TrainingProgressReporting -&amp;gt; &amp;#034;Panel&amp;#034;, TimeGoal -&amp;gt; Quantity[120, &amp;#034;Minutes&amp;#034;]]&#xD;
&#xD;
otherwise you can use&#xD;
&#xD;
    retrainednet2 = NetTrain[net, rulesnew, TargetDevice -&amp;gt; &amp;#034;GPU&amp;#034;, TrainingProgressReporting -&amp;gt; &amp;#034;Panel&amp;#034;, TimeGoal -&amp;gt; Quantity[120, &amp;#034;Minutes&amp;#034;]]&#xD;
&#xD;
Note that I use a GPU and considerable training time (2h). On a CPU this would take quite a while. Here are typical results of the EdgeDetect network:&#xD;
&#xD;
    retrainednet[EdgeDetect[CurrentImage[], 0.7]]&#xD;
&#xD;
![enter image description here][11]&#xD;
&#xD;
and the ClusteringComponents one:&#xD;
&#xD;
![enter image description here][12]&#xD;
&#xD;
We should not forget to export the network:&#xD;
&#xD;
    Export[&amp;#034;~/Desktop/teachnwnicethings2.wlnet&amp;#034;, retrainednet2]&#xD;
&#xD;
More examples&#xD;
----------&#xD;
&#xD;
Let&amp;#039;s look at the ClusteringComponents network a bit closer. We apply &#xD;
&#xD;
    GraphicsRow[{ImageResize[#, {256, 256}], beautifulnet[Colorize[ClusteringComponents[#, 7]]]}] &amp;amp;&#xD;
&#xD;
to different images to obtain:&#xD;
&#xD;
![enter image description here][13]&#xD;
&#xD;
Application to videos&#xD;
&#xD;
----------&#xD;
&#xD;
Suppose that I have the frames of a recorded movie stored in the variable movie1. Then we load our network into the variable beautifulnet:&#xD;
&#xD;
    beautifulnet = Import[&amp;#034;/Users/thiel/Desktop/teachnwnicethings2.wlnet&amp;#034;]&#xD;
&#xD;
Then the following will generate frames for an animation:&#xD;
&#xD;
    animation1 = GraphicsRow[{ImageResize[#, {256, 256}], beautifulnet[Colorize[ClusteringComponents[#, 7]]]}] &amp;amp; /@ movie1;&#xD;
&#xD;
We can animate this like so:&#xD;
&#xD;
    ListAnimate[animation1]&#xD;
&#xD;
![enter image description here][14]&#xD;
&#xD;
Conclusion&#xD;
----------&#xD;
&#xD;
These are only very preliminary results, but they show the workflow from scraping data, via generating a training set, to choosing a network and training it. I think that having more images, more training, and perhaps a small change to the net might give us much better results.  The video is quite bad, because we should use a better subject than &amp;#034;four cables in a hand&amp;#034;. It is also a bit debatable whether it is fair to say that this is how a network that has only seen beautiful things interprets the world, but I couldn&amp;#039;t resist the hype. Sorry for that!&#xD;
&#xD;
This is certainly in the realm of &amp;#034;recreational use of the Wolfram Language&amp;#034;, but the network does appear to make the world more colourful and provides a very special interpretation of the world. I hope that people in this forum who are better than me at this ([@Sebastian Bodenstein][at0] , [@Matteo Salvarezza][at1] , [@Meghan Rieu-Werden][at2] , [@Vitaliy Kaurov][at3] ?) can improve on the results. &#xD;
&#xD;
Cheers,&#xD;
&#xD;
Marco&#xD;
&#xD;
 [at0]: http://community.wolfram.com/web/sebastianb&#xD;
&#xD;
 [at1]: http://community.wolfram.com/web/matteosalvarezza&#xD;
&#xD;
 [at2]: http://community.wolfram.com/web/meghanr&#xD;
&#xD;
 [at3]: http://community.wolfram.com/web/vitaliyk&#xD;
&#xD;
&#xD;
  [1]: http://www.memo.tv/learning-to-see-you-are-what-you-see/&#xD;
  [2]: https://vimeo.com/260612034&#xD;
  [3]: http://community.wolfram.com//c/portal/getImageAttachment?filename=Screenshot2018-07-0600.33.06.png&amp;amp;userId=48754&#xD;
  [4]: http://community.wolfram.com//c/portal/getImageAttachment?filename=1838animated.gif&amp;amp;userId=48754&#xD;
  [5]: https://resources.wolframcloud.com/NeuralNetRepository/&#xD;
  [6]: http://www.wolfram.com/broadcast/s?sx=neural%20networks&amp;amp;c=105&amp;amp;v=1522&#xD;
  [7]: http://blog.wolfram.com/2018/06/14/launching-the-wolfram-neural-net-repository/&#xD;
  [8]: http://community.wolfram.com//c/portal/getImageAttachment?filename=Screenshot2018-07-0523.04.32.png&amp;amp;userId=48754&#xD;
  [9]: http://community.wolfram.com//c/portal/getImageAttachment?filename=Screenshot2018-07-0523.13.54.png&amp;amp;userId=48754&#xD;
  [10]: http://community.wolfram.com//c/portal/getImageAttachment?filename=Screenshot2018-07-0523.15.48.png&amp;amp;userId=48754&#xD;
  [11]: http://community.wolfram.com//c/portal/getImageAttachment?filename=Screenshot2018-07-0523.21.05.png&amp;amp;userId=48754&#xD;
  [12]: http://community.wolfram.com//c/portal/getImageAttachment?filename=Screenshot2018-07-0523.22.34.png&amp;amp;userId=48754&#xD;
  [13]: http://community.wolfram.com//c/portal/getImageAttachment?filename=resulttable.png&amp;amp;userId=48754&#xD;
  [14]: http://community.wolfram.com//c/portal/getImageAttachment?filename=1838animated.gif&amp;amp;userId=48754</description>
    <dc:creator>Marco Thiel</dc:creator>
    <dc:date>2018-07-05T23:14:51Z</dc:date>
  </item>
  <item rdf:about="https://community.wolfram.com/groups/-/m/t/1305281">
    <title>Basketball Boids</title>
    <link>https://community.wolfram.com/groups/-/m/t/1305281</link>
    <description>Since it is basketball season (right now we are in the middle of March Madness), I thought it&amp;#039;d be fun to make a basketball simulation. Basketball Boids is motivated by the boids model of bird flocks, with a term for separation from teammates, a term for cover/separation from the other team (whose sign depends on offense or defense), and a term for attraction towards the basket. You can try your own parameters, and you can try other models. Ideally, there would be emergent properties such as teamwork and creativity, as in real bird flocks.&#xD;
&#xD;
![basketball boids][1]&#xD;
&#xD;
Blue is offense and red is defense.&#xD;
&#xD;
    m = Through[{m1, m2, m3, m4, m5}[t]];(*{m1[t],m2[t],m3[t],m4[t],m5[t]}*)&#xD;
    p = Through[{p1, p2, p3, p4, p5}[t]];(*{p1[t],p2[t],p3[t],p4[t],p5[t]}*)&#xD;
&#xD;
Those are the variables, and the following solves the differential equations with the boid parameters substituted in. The separation and cover terms are inverse-square forces, but the hoop term is purely radial.&#xD;
&#xD;
    dm[n_] :=  separation*   Sum[(m[[n]] -  m[[i]])/((m[[n]] - m[[i]]).(m[[n]] - m[[i]]))^(3/2), {i, 5}] +  cover*Sum[(m[[n]] - &#xD;
           p[[i]])/((m[[n]] - p[[i]]).(m[[n]] - p[[i]]))^(3/2), {i, 5}] + hoop*(-m[[n]]/(m[[n]].m[[n]])^(1/2))&#xD;
    &#xD;
    dp[n_] := separation*Sum[(p[[n]] -  p[[i]])/((p[[n]] - p[[i]]).(p[[n]] - p[[i]]))^(3/2), {i, 5}] - cover*Sum[(p[[n]] - &#xD;
           m[[i]])/((p[[n]] - m[[i]]).(p[[n]] - m[[i]]))^(3/2), {i, 5}] +   hoop*(-p[[n]]/(p[[n]].p[[n]])^(1/2))&#xD;
&#xD;
    params={separation -&amp;gt; 1,  cover -&amp;gt; 10, hoop -&amp;gt; 100};&#xD;
&#xD;
    sol = Quiet[NDSolve[Evaluate[N@Flatten[Table[{D[m[[i]], t, t] == dm[i], &#xD;
              D[p[[i]], t, t] == dp[i], (D[m[[i]], t] /. t -&amp;gt; 0) == {0, 0}, (D[p[[i]], t] /. t -&amp;gt; 0) == {0, 0}, (m[[i]] /. t -&amp;gt; 0) == RandomReal[1, 2], (p[[i]] /. t -&amp;gt; 0) == RandomReal[1, 2]}, {i, 5}]]/. params],Join[m, p], {t, 0, 1}][[1]]];&#xD;
              &#xD;
&#xD;
    ParametricPlot[Evaluate[Join[m, p] /. sol], {t, 0, 1},  PlotStyle -&amp;gt; Join[Table[Blue, 5], Table[Red, 5]]]&#xD;
![parametric plot of a solution][2]&#xD;
&#xD;
The animation above was made with&#xD;
&#xD;
    Animate[ListPlot[Join[Style[#, Blue] &amp;amp; /@ m, Style[#, Red] &amp;amp; /@ p] /. sol /. t -&amp;gt; s, &#xD;
      Axes -&amp;gt; False, Prolog -&amp;gt; {Brown, Disk[{0, 0}, .2]},  AspectRatio -&amp;gt; 1, PlotRange -&amp;gt; 4 {{-1, 1}, {-1, 1}}], {s, 0, 1, .01}]&#xD;
&#xD;
(I remember talking several years ago to a student at the [Wolfram High School Summer Program][3] about Boids, which is why I was thinking about them.)&#xD;
&#xD;
&#xD;
  [1]: http://community.wolfram.com//c/portal/getImageAttachment?filename=basketball-boids.gif&amp;amp;userId=23275&#xD;
  [2]: http://community.wolfram.com//c/portal/getImageAttachment?filename=basketball-boids.png&amp;amp;userId=23275&#xD;
  [3]: https://education.wolfram.com/summer/camp/</description>
    <dc:creator>Todd Rowland</dc:creator>
    <dc:date>2018-03-20T14:23:53Z</dc:date>
  </item>
  <item rdf:about="https://community.wolfram.com/groups/-/m/t/879610">
    <title>Wolfram|Alpha for HTC Vive or other VR devices?</title>
    <link>https://community.wolfram.com/groups/-/m/t/879610</link>
    <description>I had an idea regarding Wolfram|Alpha while waiting for my own HTC Vive [(website)][1] to arrive. If it could be implemented as a program for the Vive, it could make 3D plots (both over the real numbers and over the complex plane) easy to visualize, especially with its &amp;#034;Room Scale&amp;#034; capabilities. I know that the Vive was really made with games in mind, but I think it could also be useful with Wolfram|Alpha. I&amp;#039;m not sure if this is the right place to post this idea, but hopefully the Wolfram|Alpha team will see it and consider it. Please share your thoughts so the developers can see your ideas too.&#xD;
&#xD;
&#xD;
  [1]: http://www.htcvive.com</description>
    <dc:creator>Caelum Codicem</dc:creator>
    <dc:date>2016-06-30T05:13:39Z</dc:date>
  </item>
  <item rdf:about="https://community.wolfram.com/groups/-/m/t/772917">
    <title>A Brief History of Noise, The Atlantic, 1/5/16</title>
    <link>https://community.wolfram.com/groups/-/m/t/772917</link>
    <description>On January 5th, 2016, *The Atlantic* published an article by Rose Eveleth entitled &amp;#034;[A Brief History of Noise: From the big bang to cellphones][1]&amp;#034; featuring my father, nuclear physicist [John G. Cramer][2], who used Mathematica to recreate a simulation of the sound of the Big Bang. [He first did this in 2003][3] and then [did it again in 2013 using new data][4]. (There is also [a version on YouTube featuring graphics from the LISA Mission][5].)&#xD;
&#xD;
[Here is a link to an archive of the initial Mathematica][6] notebook in the Wolfram Library Archive.&#xD;
&#xD;
UPDATE: I asked him if there is a newer version of the file than the one in the archive. The most recent BBSound notebook, used for the Planck data, is at:&#xD;
    [http://faculty.washington.edu/jcramer/BigBang/Planck_2013/BB_Sound_3.nb][7]&#xD;
and the Planck data file is at&#xD;
    [http://faculty.washington.edu/jcramer/BigBang/Planck_2013/PlanckData.txt][8]&#xD;
&#xD;
&#xD;
  [1]: http://www.theatlantic.com/science/archive/2016/01/a-brief-history-of-noise/422481/&#xD;
  [2]: https://faculty.washington.edu/jcramer/&#xD;
  [3]: http://www.wolfram.com/mathematica/customer-stories/mathematica-simulates-the-sound-of-the-big-bang.html&#xD;
  [4]: http://www.washington.edu/news/2013/04/04/listening-to-the-big-bang-in-high-fidelity-audio/&#xD;
  [5]: https://youtu.be/1OpNI5DjxC0&#xD;
  [6]: http://library.wolfram.com/infocenter/MathSource/5083&#xD;
  [7]: http://faculty.washington.edu/jcramer/BigBang/Planck_2013/BB_Sound_3.nb&#xD;
  [8]: http://faculty.washington.edu/jcramer/BigBang/Planck_2013/PlanckData.txt</description>
    <dc:creator>Kathryn Cramer</dc:creator>
    <dc:date>2016-01-13T16:47:17Z</dc:date>
  </item>
  <item rdf:about="https://community.wolfram.com/groups/-/m/t/908742">
    <title>[WSSA16] Hand Gesture &amp;amp; Sign Language Recognition</title>
    <link>https://community.wolfram.com/groups/-/m/t/908742</link>
    <description>## Abstract&#xD;
&#xD;
This Wolfram Summer School project presents an image-based approach to gesture recognition using machine learning and neural network training.&#xD;
&#xD;
To date, gesture recognition remains a complex problem solved with a variety of techniques, each with its own pros and cons. It has a wide range of applications, from sign language recognition to virtual reality simulation. Sign language is not universal: almost every country has its own, although they are quite similar in many ways.&#xD;
&#xD;
_American sign language_&#xD;
&#xD;
![American sign language][1]&#xD;
&#xD;
Two methods of recognition and four different training sets are used. All gestures in this work come from the American and Polish sign languages.&#xD;
&#xD;
- The first method classifies the data with the **Classify** function. It was applied to a complex dataset of images captured in an uncontrolled environment and gave poor results, with accuracy below 3%. The reason is that the image database contains full-body images, from which it is hard to isolate the hands.&#xD;
&#xD;
- The second method trains a neural network with **NetTrain**. The dataset includes 3477 images for 5 ASL gestures. The trained network is tested on two sets of images, one with uniform and one with non-uniform backgrounds. This method produced 76-77% accuracy and could be improved further, potentially up to 100% for static gestures, by enlarging the datasets.&#xD;
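&#xD;
As a language-agnostic illustration of the train/validate workflow described above (synthetic stand-in images and a deliberately simple nearest-mean classifier in Python, not the actual ASL data or the Wolfram **Classify**/**NetTrain** functions):&#xD;
&#xD;
```python
import numpy as np

rng = np.random.default_rng(0)

def make_images(label, n, size=8):
    """Synthetic 'gesture' images: one bright column per class plus noise."""
    imgs = rng.normal(0.0, 0.1, (n, size, size))
    imgs[:, :, label] += 1.0
    return imgs

classes = range(5)                         # 5 gesture classes
train = {c: make_images(c, 20) for c in classes}
test = {c: make_images(c, 10) for c in classes}

# Nearest-mean classifier: represent each class by its mean training image.
means = {c: imgs.mean(axis=0) for c, imgs in train.items()}

def classify(img):
    return min(means, key=lambda c: np.sum((img - means[c]) ** 2))

# Accuracy on the held-out set, analogous to cm["Accuracy"] later in the post.
correct = sum(classify(img) == c for c, imgs in test.items() for img in imgs)
total = sum(len(imgs) for imgs in test.values())
accuracy = correct / total
```
&#xD;
The same loop structure underlies any accuracy measurement: classify every held-out image, then divide the number of correct predictions by the total.&#xD;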
____&#xD;
## Classify Function&#xD;
&#xD;
The [first dataset][3] contains 899 images of 12 different people performing 27 gestures from Polish sign language (set 1). It is complemented by 2072 full-body images of 20 different people (set 2) and by images captured from one person (set 3).&#xD;
&#xD;
![enter image description here][4]&#xD;
&#xD;
The test sets are generated by selecting from the original training sets. Several combinations of training and validation data selections were tested:&#xD;
&#xD;
![enter image description here][5]&#xD;
&#xD;
None of the large datasets gave sufficient results. Nevertheless, **Classify** can produce good results with carefully sorted data.&#xD;
&#xD;
![enter image description here][6]&#xD;
&#xD;
___&#xD;
## NetTrain&#xD;
The [training][7] and [validation][8] databases for the network include images of hands only. There are 3477 images in the training set, plus 317 test images with a uniform background and 317 with a complex background. The table below shows how many times each gesture is represented.&#xD;
&#xD;
![enter image description here][9]&#xD;
&#xD;
Training uses a 13-layer setup. The resulting accuracy is about 77% for the uniform-background validation data and about 75% for the complex-background data.&#xD;
&#xD;
![enter image description here][10]&#xD;
&#xD;
    cm = ClassifierMeasurements[lenet, uniTestData];&#xD;
    cm[&amp;#034;Accuracy&amp;#034;]&#xD;
    0.766562&#xD;
&#xD;
The confusion matrix is presented below.&#xD;
&#xD;
![enter image description here][11]&#xD;
&#xD;
___&#xD;
## Examples&#xD;
&#xD;
![enter image description here][12]&#xD;
&#xD;
	img&#xD;
&#xD;
![enter image description here][13]&#xD;
&#xD;
	lenet[img,&amp;#034;TopProbabilities&amp;#034;]&#xD;
	{&amp;#034;b&amp;#034; -&amp;gt; 0.981646}&#xD;
&#xD;
___&#xD;
## Conclusion&#xD;
&#xD;
- Reliable identification requires separating the human body from the background, as well as distinguishing the different parts of the body.&#xD;
&#xD;
- To improve recognition results, the database should be enlarged and the images pre-processed. Feature-extraction algorithms would make it possible to remove the background and isolate the hands in an image.&#xD;
&#xD;
- The preferred method is neural network training, as it clearly outperforms the **Classify** function in this case.&#xD;
&#xD;
  [1]: http://community.wolfram.com//c/portal/getImageAttachment?filename=18kzh07ccj6zkjpg.jpg&amp;amp;userId=900559&#xD;
  [2]: http://community.wolfram.com//c/portal/getImageAttachment?filename=8525polish.png&amp;amp;userId=900559&#xD;
  [3]: http://sun.aei.polsl.pl/~mkawulok/gestures/hgr1_images.zip&#xD;
  [4]: http://community.wolfram.com//c/portal/getImageAttachment?filename=test.gif&amp;amp;userId=900559&#xD;
  [5]: http://community.wolfram.com//c/portal/getImageAttachment?filename=trainSet6.png&amp;amp;userId=900559&#xD;
  [6]: http://community.wolfram.com//c/portal/getImageAttachment?filename=pasted_image_at_2016_08_19_02_21_pm.png&amp;amp;userId=900559&#xD;
  [7]: http://www.idiap.ch/resource/gestures/data/shp_marcel_train.tar.gz&#xD;
  [8]: http://www.idiap.ch/resource/gestures/data/shp_marcel_test.tar.gz&#xD;
  [9]: http://community.wolfram.com//c/portal/getImageAttachment?filename=trainSet.png&amp;amp;userId=900559&#xD;
  [10]: http://community.wolfram.com//c/portal/getImageAttachment?filename=Capture.PNG&amp;amp;userId=900559&#xD;
  [11]: http://community.wolfram.com//c/portal/getImageAttachment?filename=trainSet7.png&amp;amp;userId=900559&#xD;
  [12]: http://community.wolfram.com//c/portal/getImageAttachment?filename=Capture2.PNG&amp;amp;userId=900559&#xD;
  [13]: http://community.wolfram.com//c/portal/getImageAttachment?filename=no.jpg&amp;amp;userId=900559</description>
    <dc:creator>Lev Dushkin</dc:creator>
    <dc:date>2016-08-19T14:31:12Z</dc:date>
  </item>
  <item rdf:about="https://community.wolfram.com/groups/-/m/t/814988">
    <title>Implementing Minecraft in Wolfram Language</title>
    <link>https://community.wolfram.com/groups/-/m/t/814988</link>
    <description>![enter image description here][1]&#xD;
&#xD;
Some time ago I asked myself: with all these great graphics and interactive capabilities of _Mathematica_, what kinds of 3D games can be implemented in it?  And the answer which came to mind is [Minecraft classic][2]. The scene in this game is almost static, the [first person view][3] and [controls][4] can be easily implemented, the terrain textures are [freely available][5], and the overall functionality does not seem to be complicated... There is even a related [demonstration][6] on the Demonstrations Project!&#xD;
&#xD;
&#xD;
----------&#xD;
&#xD;
&#xD;
**So, can Mathematica really handle Minecraft classic game functionality? Well, the answer seems to be YES :)**&#xD;
&#xD;
&#xD;
----------&#xD;
&#xD;
&#xD;
Here is my implementation of the Minecraft classic game in Mathematica. Let&amp;#039;s start with some screenshots taken during the construction of the final scene, which is displayed at the end of this post.&#xD;
&#xD;
![enter image description here][7]&#xD;
&#xD;
![enter image description here][8]&#xD;
&#xD;
![enter image description here][9]&#xD;
&#xD;
![enter image description here][10]&#xD;
&#xD;
&#xD;
**Features**&#xD;
&#xD;
-	Blocks can be created and removed&#xD;
-	One texture per block&#xD;
-	The player automatically jumps onto obstacles one block high and onto blocks created directly underneath. You can also fall down.&#xD;
-	Simplified selection tracking is implemented; it can miss cube corners, but it is still quite intuitive and allows placing blocks diagonally.&#xD;
-	Large action range: you can place and remove blocks located far away.&#xD;
&#xD;
&#xD;
**Controls**&#xD;
&#xD;
-	W-A-S-D: move forward-left-backward-right. By default double steps are used; the Shift key enables single steps.&#xD;
-	Arrow keys: look up-down-left-right&#xD;
-	Mouse selects current block&#xD;
-	1-9: select block type&#xD;
-	B: show blocks selector&#xD;
-	Left mouse click: delete block&#xD;
-	Right mouse click or Space: create block&#xD;
-	R: set respawn position&#xD;
-	Enter: respawn&#xD;
-	X: Save game state&#xD;
-	L: Load game state&#xD;
&#xD;
&#xD;
**Performance tuning**&#xD;
&#xD;
**Terrain construction.** A simple random-walk terrain generator is implemented. The following parameters can be adjusted:&#xD;
&#xD;
`prmTERRAINBLOCKSN`  approximate number of terrain blocks&#xD;
&#xD;
`prmCLOUDSN`  number of clouds. Each cloud consists of random number of blocks.&#xD;
&#xD;
`prmTERRAINGRAIN` and `prmTERRAINOFFSET` control the landscape properties. In the first picture below `prmTERRAINOFFSET` is 8; in the second it is 3.&#xD;
&#xD;
![enter image description here][11]![enter image description here][12]&#xD;
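&#xD;
To show the idea behind this kind of generator, here is a hedged sketch of random-walk terrain generation in Python (`grain` and `offset` are my own analogues of `prmTERRAINGRAIN` and `prmTERRAINOFFSET`, not the actual implementation):&#xD;
&#xD;
```python
import numpy as np

def random_walk_terrain(width=32, n_blocks=1000, grain=3, offset=3, seed=0):
    """Drop blocks along a 2D random walk, capped at `offset` layers high."""
    rng = np.random.default_rng(seed)
    heights = np.zeros((width, width), dtype=int)
    x, y = width // 2, width // 2              # start in the middle of the map
    placed = 0
    while placed != n_blocks:
        dx, dy = rng.integers(-grain, grain + 1, size=2)  # grain = step size
        x = int(np.clip(x + dx, 0, width - 1))
        y = int(np.clip(y + dy, 0, width - 1))
        if heights[x, y] != offset:            # respect the height cap
            heights[x, y] += 1
            placed += 1
    return heights
```
&#xD;
The resulting height map can then be turned into columns of blocks; larger `offset` values allow taller terrain columns.&#xD;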
&#xD;
**Hardware issues.** On some systems the presence of a _single_ opacity directive drastically decreases performance. In this case one can set `prmDISABLETRANSPARENCY` to `True` and/or try `prmRENDERINGENGINE=BSPTree`.&#xD;
&#xD;
**Conclusion**&#xD;
&#xD;
To be honest, I am myself surprised at how well the final code performs. On an average system it easily handles thousands and even tens of thousands of blocks, although as this number grows the &amp;#034;gameplay&amp;#034; eventually becomes too slow. It should also be noted that what really matters is the number of faces, because hidden faces are not included in the final Graphics3D, so clustered blocks are preferable.&#xD;
&#xD;
The result of my first construction session is&#xD;
&#xD;
![enter image description here][13]&#xD;
&#xD;
&#xD;
The code has been tested on _Mathematica_ version 8.0.4 under the WinXP and Win7 operating systems. Further improvements are welcome, as are comments about the code organisation and style. Thank you!&#xD;
&#xD;
I would like to share some observations about the performance of the above code ([here is the link][14] to it on Pastebin for convenience).&#xD;
&#xD;
The speed of the current implementation is both hardware and version specific. There are two main characteristics: the smoothness of motion, and the speed of the scene update after block creation or removal. I suspect that the smoothness depends on the graphics card, and the update speed on both the processor and the GPU.&#xD;
&#xD;
On my work machine, a 3 GHz Intel G860 processor with integrated Intel HD graphics running Win7, I observe the following:&#xD;
&#xD;
- The default scene with 5000 terrain blocks and 3 clouds is smooth enough for comfortable movement _with transparency disabled_. It speeds up a bit when no dynamic selection is displayed (i.e. when I point at the sky).&#xD;
&#xD;
- The update speed is approximately 1-2 seconds per operation.&#xD;
&#xD;
- The overall performance deteriorates significantly with transparency enabled. One way out is to set `prmCLOUDSN=0`, since the clouds are the only transparent objects in the default scene.&#xD;
&#xD;
- With 20,000 terrain blocks and opacity disabled, movement is still pretty smooth; a scene update takes 2-3 seconds.&#xD;
&#xD;
- There are no performance differences between versions 8 and 9 on my system.&#xD;
&#xD;
- The scene I&amp;#039;ve constructed began with 1000 terrain blocks, no clouds (I added them manually) and no transparency. With these initial settings the scene updates instantly and is comfortable for construction. Honestly, I am not sure how transparency is handled on different systems, but on a system with an old discrete GeForce card it seemed to work faster than on integrated GPUs.&#xD;
&#xD;
So my advice on improving performance is still:&#xD;
&#xD;
- Use `prmTERRAINBLOCKSN` and `prmCLOUDSN` wisely. Consider setting them to zero altogether.&#xD;
&#xD;
- Try `prmDISABLETRANSPARENCY=True`&#xD;
&#xD;
- Point to the sky while moving&#xD;
&#xD;
---&#xD;
&#xD;
&#xD;
----------&#xD;
*NOTE: this article was originally posted at [Mathematica Stack Exchange][15].*&#xD;
&#xD;
----------&#xD;
&#xD;
&#xD;
This scene can be downloaded (and loaded from _Mathematica_) from [here][16]. Here is the code.&#xD;
&#xD;
&#xD;
    prmWORLDWIDTH = 200;&#xD;
    prmWORLDHEIGHT = 100;&#xD;
    prmVIEWERHEIGHT = 2.75;&#xD;
    prmVIEWRANGE = {0.01, 300};&#xD;
    prmMOVESTEP = .95;&#xD;
    prmACTIONRANGE = 300;&#xD;
    prmTRACESTEP = 0.33;&#xD;
    prmVIEWANGLE = 45 Degree;&#xD;
    prmFALLINGPAUSE = 0;&#xD;
    prmVERTLOOKANGLEDELTA = 4.99 Degree;&#xD;
    prmHORLOOKANGLEDELTA = 90 Degree/4.;&#xD;
    prmSKYCOLOR = RGBColor[0.58, 0.77, 0.96];&#xD;
    prmTEXTURESIZE = 16;&#xD;
    prmTERRAINBLOCKSN = 5000;&#xD;
    prmCLOUDSN = 3;&#xD;
    prmFLOORMATERIAL = matSand;&#xD;
    prmRENDERINGENGINE = Automatic;&#xD;
    prmDISABLETRANSPARENCY = False;&#xD;
    prmSMOOTHTERRAIN = True;&#xD;
    prmTERRAINGRAIN = 3;&#xD;
    prmTERRAINOFFSET = 3;&#xD;
    &#xD;
    terrainImg = Import[&amp;#034;http://i.imgur.com/2uAswvI.png&amp;#034;];&#xD;
    ClearAll[&amp;#034;mat*&amp;#034;];&#xD;
    materials =&#xD;
      {matGrass -&amp;gt; {1, 1},&#xD;
       matStone -&amp;gt; {1, 2},&#xD;
       matDirt -&amp;gt; {1, 3},&#xD;
       matPlanks -&amp;gt; {1, 5},&#xD;
       matPlate -&amp;gt; {1, 7},&#xD;
       matBricks -&amp;gt; {1, 8},&#xD;
       matCobblestone -&amp;gt; {2, 1},&#xD;
       matBedrock -&amp;gt; {2, 2},&#xD;
       matSand -&amp;gt; {2, 3},&#xD;
       matGravel -&amp;gt; {2, 4},&#xD;
       matWood -&amp;gt; {2, 5},&#xD;
       matLeaves -&amp;gt; {2, 7},&#xD;
       matMossStone -&amp;gt; {3, 5},&#xD;
       matObsidian -&amp;gt; {3, 6},&#xD;
       matGlass -&amp;gt; {4, 2},&#xD;
       matWhiteWool -&amp;gt; {5, 16},&#xD;
       matGrayWool -&amp;gt; {5, 15},&#xD;
       matDarkGrayWool -&amp;gt; {5, 14},&#xD;
       matMagentaWool -&amp;gt; {5, 13},&#xD;
       matPinkWool -&amp;gt; {5, 12},&#xD;
       matPurpleWool -&amp;gt; {5, 10},&#xD;
       matBlueWool -&amp;gt; {5, 9},&#xD;
       matLightBlueWool -&amp;gt; {5, 8},&#xD;
       matCyanWool -&amp;gt; {5, 7},&#xD;
       matGreenWool -&amp;gt; {5, 5},&#xD;
       matLimeWool -&amp;gt; {5, 4},&#xD;
       matYellowWool -&amp;gt; {5, 3},&#xD;
       matOrangeWool -&amp;gt; {5, 2},&#xD;
       matRedWool -&amp;gt; {5, 1},&#xD;
       matClouds -&amp;gt; {1, 12},&#xD;
       matSilver -&amp;gt; {2, 8},&#xD;
       matGold -&amp;gt; {2, 9}&#xD;
       };&#xD;
    &#xD;
    dirVectors = {{0, 1, 0}, {1, 0, 0}, {0, -1, 0}, {-1, 0, 0}, {0, &#xD;
        0, -1}, {0, 0, 1}};&#xD;
    vtc = {{0, 0}, {1, 0}, {1, 1}, {0, 1}};&#xD;
    vertCoords = # - {1, 1, 1} &amp;amp; /@ {{0, 0, 0}, {0, 0, 1}, {1, 0, 1}, {1, &#xD;
         0, 0}, {0, 1, 0}, {0, 1, 1}, {1, 1, 1}, {1, 1, 0}};&#xD;
    faceCoords = {{7, 6, 5, 8}, {3, 7, 8, 4}, {2, 3, 4, 1}, {6, 2, 1, &#xD;
        5}, {5, 8, 4, 1}, {3, 2, 6, 7}};&#xD;
    &#xD;
    filename = &amp;#034;save.mmc&amp;#034;;&#xD;
    &#xD;
    &#xD;
    initMaterials[] := Block[{},&#xD;
       nMat = Length@materials;&#xD;
       Evaluate[materials[[All, 1]]] = Range[nMat];&#xD;
       matAir = 0;&#xD;
       With[{ts = prmTEXTURESIZE},&#xD;
        textures = &#xD;
         ImageTake[terrainImg, ts (#1 - 1) + {1, ts}, &#xD;
            ts (#2 - 1) + {1, ts}] &amp;amp;&#xD;
          @@@ (materials[[All, 2]])&#xD;
        ];&#xD;
       textures[[matClouds]] = &#xD;
        Image[Array[{1, 1, 1} &amp;amp;, {prmTEXTURESIZE, prmTEXTURESIZE}]];&#xD;
       ClearAll[transparentQ];&#xD;
       Do[transparentQ[mat] = &#xD;
         MemberQ[{matLeaves, matGlass, matClouds, matAir}, mat], {mat, 0, &#xD;
         nMat}];&#xD;
       If[! prmDISABLETRANSPARENCY,&#xD;
        textures[[matLeaves]] = &#xD;
         ImageData[&#xD;
           textures[[&#xD;
            matLeaves]]] /. {{1., 1., 1.} -&amp;gt; {0., .5, 0., 0.}, {r_, g_, &#xD;
             b_} :&amp;gt; {r, g, b, 1.}};&#xD;
        textures[[matGlass]] = &#xD;
         ImageData[&#xD;
           textures[[&#xD;
            matGlass]]] /. {{1., 1., 1.} -&amp;gt; {5., .5, .1, 0.}, {r_, g_, &#xD;
             b_} :&amp;gt; {r, g, b, 1.}};&#xD;
        textures[[matClouds]] = &#xD;
         Array[{1, 1, 1, .75} &amp;amp;, {prmTEXTURESIZE, prmTEXTURESIZE}];&#xD;
        ];&#xD;
       ];&#xD;
    &#xD;
    initIcons[] := Block[{},&#xD;
       icons = Graphics3D[{ EdgeForm@None, Texture[#],&#xD;
            Polygon[# &amp;amp; /@ vertCoords[[#]], &#xD;
               VertexTextureCoordinates -&amp;gt; vtc] &amp;amp; /@ faceCoords},&#xD;
           Lighting -&amp;gt; &amp;#034;Neutral&amp;#034;, Boxed -&amp;gt; False, ImageSize -&amp;gt; 64, &#xD;
           Background -&amp;gt; Black] &amp;amp;&#xD;
         /@ textures;&#xD;
       setterbar = Column[SetterBar[&#xD;
            Dynamic[&#xD;
             palette[[&#xD;
              curBlockType]], {(palette[[&#xD;
                  curBlockType]] = #) &amp;amp;, (updatePalette[]; &#xD;
                DialogReturn[]) &amp;amp;}], #] &amp;amp;&#xD;
          /@ Partition[Thread[Range[nMat] -&amp;gt; icons], 6, 6, {1, 1}, {}]&#xD;
         ];&#xD;
       palette = {matStone, matCobblestone, matBricks, matDirt, matPlanks,&#xD;
          matWood, matLeaves, matGlass, matPlate};&#xD;
       curBlockType = 1;&#xD;
       updatePalette[];&#xD;
       ];&#xD;
    &#xD;
    updatePalette[] := (paletteGfx = Image[&#xD;
         GraphicsRow[icons[[palette]],&#xD;
          Evaluate[Frame -&amp;gt; Array[# == curBlockType &amp;amp;, 9]],&#xD;
          Evaluate[FrameStyle -&amp;gt; Directive[White, AbsoluteThickness@3]],&#xD;
          Background -&amp;gt; Black&#xD;
          ], ImageSize -&amp;gt; 500]);&#xD;
    &#xD;
    updateCubes[] := (cucubes = Flatten@cubes;);&#xD;
    &#xD;
    saveGame[file_] := &#xD;
      Export[file, {pos, viewDir, moveDir, strafeDir, palette, &#xD;
         curBlockType, SparseArray@blocks} // Compress, &amp;#034;Text&amp;#034;];&#xD;
    &#xD;
    loadGame[file_] := Block[{p, vd, md, sd, pal, cbt, bl},&#xD;
       If[! FileExistsQ[file], MessageDialog[&amp;#034;File not found&amp;#034;]; Return[]];&#xD;
       {p, vd, md, sd, pal, cbt, bl} = Uncompress@Import[file, &amp;#034;Text&amp;#034;];&#xD;
       {pos, viewDir, moveDir, strafeDir, palette, curBlockType} = {p, vd,&#xD;
          md, sd, pal, cbt};&#xD;
       blocks = Normal@bl;&#xD;
       dim = Dimensions@blocks;&#xD;
       {prmWORLDWIDTH, prmWORLDHEIGHT} = Rest@dim;&#xD;
       initFloor[];&#xD;
       initCubes[]; updateCubes[]; updatePalette[]; getSelection[];&#xD;
       FinishDynamic[];&#xD;
       ];&#xD;
    &#xD;
    saveDialog[] := CreateDialog[&#xD;
       Grid@{{Dynamic[&amp;#034;Save to file: &amp;#034; &amp;lt;&amp;gt; filename], &#xD;
          FileNameSetter[Dynamic[filename], &amp;#034;Save&amp;#034;]},&#xD;
         {DefaultButton[saveGame[filename]; DialogReturn[]],&#xD;
          CancelButton[]&#xD;
          }}&#xD;
       ];&#xD;
    &#xD;
    loadDialog[] := CreateDialog[&#xD;
       Grid@{{Dynamic[&amp;#034;Load from file: &amp;#034; &amp;lt;&amp;gt; filename], &#xD;
          FileNameSetter[Dynamic[filename], &amp;#034;Open&amp;#034;, {&amp;#034;mmc&amp;#034; -&amp;gt; {&amp;#034;*&amp;#034;}}]},&#xD;
         {DefaultButton[loadGame[filename]; DialogReturn[]],&#xD;
          CancelButton[]&#xD;
          }}&#xD;
       ];&#xD;
    &#xD;
    showBlockChooser[] := CreateDialog[setterbar, {},&#xD;
       WindowSize -&amp;gt; 500,&#xD;
       Background -&amp;gt; Black,&#xD;
       Modal -&amp;gt; True,&#xD;
       WindowFrame -&amp;gt; &amp;#034;Frameless&amp;#034;,&#xD;
       TextAlignment -&amp;gt; Center&#xD;
       ];&#xD;
    &#xD;
    initBlocks[] := (&#xD;
       dim = {prmWORLDWIDTH, prmWORLDWIDTH, prmWORLDHEIGHT};&#xD;
       blocks = Array[0 &amp;amp;, dim];&#xD;
       );&#xD;
    &#xD;
    initCamera[] := Block[{},&#xD;
       pos = {1.5, 1.5, prmVIEWERHEIGHT};&#xD;
       height = Ceiling@prmVIEWERHEIGHT;&#xD;
       moveDir = {1, 1, 0} // Normalize;&#xD;
       viewDir = moveDir;&#xD;
       strafeDir = {1, -1, 0} // Normalize;&#xD;
       respawnPos = Null;&#xD;
       currentBlockPos = newBlockPos = Null;&#xD;
       selection = {};&#xD;
       viewAngle = 0;&#xD;
       ];&#xD;
    &#xD;
    initFloor[] := (floor = With[{w = prmWORLDWIDTH},&#xD;
         {EdgeForm[None],&#xD;
          Texture[textures[[prmFLOORMATERIAL]]],&#xD;
          Polygon[{{0, 0, 0}, {0, w, 0}, {w, w, 0}, {w, 0, 0}},&#xD;
           VertexTextureCoordinates -&amp;gt; {{0, 0}, {w, 0}, {w, w}, {0, w}}]}&#xD;
         ]);&#xD;
    &#xD;
    initCubes[] := Block[{g, type, pointers, faces},&#xD;
       cubes = {Texture@#} &amp;amp; /@ textures;&#xD;
       cubePointers = Developer`ToPackedArray[{{0, 0, 0}}] &amp;amp; /@ textures;&#xD;
       g = ParallelMap[{#, createCube[#]} &amp;amp;,&#xD;
         Position[blocks, b_ /; b &amp;gt; 0]];&#xD;
       Scan[({pointers, faces} = Transpose@#;&#xD;
          type = blockAt@First@pointers;&#xD;
          cubes[[type]] = cubes[[type]]~Join~faces;&#xD;
          cubePointers[[type]] = cubePointers[[type]]~Join~pointers;&#xD;
          ) &amp;amp;,&#xD;
        GatherBy[g, blockAt@First@# &amp;amp;]&#xD;
        ];&#xD;
       ];&#xD;
    &#xD;
    processFalling[] := Block[{i, j, k}, While[&#xD;
        ({i, j, k} = blockPos[pos])[[3]] &amp;gt; height &amp;amp;&amp;amp; &#xD;
         blocks[[i, j, k - height]] == 0,&#xD;
        pos -= {0, 0, 1}; FinishDynamic[]; Pause[prmFALLINGPAUSE]&#xD;
        ]];&#xD;
    &#xD;
    &#xD;
    lookHor[da_] := ({moveDir, strafeDir, viewDir} = &#xD;
        RotationTransform[da, {0., 0., 1.}] /@ {moveDir, strafeDir, &#xD;
          viewDir});&#xD;
    lookVert[da_] :=&#xD;
      If[Abs[viewAngle + da] &amp;lt;= Pi/2,&#xD;
       viewAngle += da;&#xD;
       viewDir = RotationTransform[da, strafeDir]@viewDir&#xD;
       ];&#xD;
    &#xD;
    move[dv_, n_Integer] := Do[move@dv, {n}];&#xD;
    move[dv_] := Block[{newpos, i, j, k, space},&#xD;
       newpos = pos + dv;&#xD;
       If[! inRange@newpos, Return[]];&#xD;
       {i, j, k} = blockPos@newpos;&#xD;
       If[k + 1 &amp;gt; prmWORLDHEIGHT, Return[]];&#xD;
       space = blocks[[i, j, (k - height + 1) ;; k + 1]];&#xD;
       Which[&#xD;
        And @@ Thread[Most@space == 0], pos = newpos,&#xD;
        First@space != 0 &amp;amp;&amp;amp; (And @@ Thread[Rest@space == 0]), &#xD;
        pos = newpos + {0, 0, 1}&#xD;
        ];&#xD;
       processFalling[];&#xD;
       ];&#xD;
    &#xD;
    &#xD;
    processKeyboard[] := (&#xD;
      Switch[CurrentValue[&amp;#034;EventKey&amp;#034;],&#xD;
       &amp;#034;W&amp;#034;, move[prmMOVESTEP  moveDir],&#xD;
       &amp;#034;S&amp;#034;, move[-prmMOVESTEP moveDir],&#xD;
       &amp;#034;A&amp;#034;, move[-prmMOVESTEP strafeDir],&#xD;
       &amp;#034;D&amp;#034;, move[prmMOVESTEP strafeDir],&#xD;
       &amp;#034;w&amp;#034;, move[prmMOVESTEP  moveDir, 2],&#xD;
       &amp;#034;s&amp;#034;, move[-prmMOVESTEP  moveDir, 2],&#xD;
       &amp;#034;a&amp;#034;, move[-prmMOVESTEP  strafeDir, 2],&#xD;
       &amp;#034;d&amp;#034;, move[prmMOVESTEP  strafeDir, 2],&#xD;
       &amp;#034;q&amp;#034;, pos += {0, 0, 1},&#xD;
       &amp;#034;b&amp;#034;, showBlockChooser[],&#xD;
       &amp;#034;r&amp;#034;, (respawnPos = pos),&#xD;
       &amp;#034;x&amp;#034;, saveDialog[],&#xD;
       &amp;#034;l&amp;#034;, loadDialog[],&#xD;
       &amp;#034; &amp;#034;, addCurrentBlock[],&#xD;
       &amp;#034;1&amp;#034;, curBlockType = 1; updatePalette[],&#xD;
       &amp;#034;2&amp;#034;, curBlockType = 2; updatePalette[],&#xD;
       &amp;#034;3&amp;#034;, curBlockType = 3; updatePalette[],&#xD;
       &amp;#034;4&amp;#034;, curBlockType = 4; updatePalette[],&#xD;
       &amp;#034;5&amp;#034;, curBlockType = 5; updatePalette[],&#xD;
       &amp;#034;6&amp;#034;, curBlockType = 6; updatePalette[],&#xD;
       &amp;#034;7&amp;#034;, curBlockType = 7; updatePalette[],&#xD;
       &amp;#034;8&amp;#034;, curBlockType = 8; updatePalette[],&#xD;
       &amp;#034;9&amp;#034;, curBlockType = 9; updatePalette[]&#xD;
       ];&#xD;
      getSelection[];&#xD;
      )&#xD;
    &#xD;
    actions = {&#xD;
       {&amp;#034;MouseDown&amp;#034;, 1} :&amp;gt; deleteCurrentBlock[],&#xD;
       {&amp;#034;MouseUp&amp;#034;, 2} :&amp;gt; (addCurrentBlock[]; getSelection[]),&#xD;
       &amp;#034;MouseMoved&amp;#034; :&amp;gt; getSelection[],&#xD;
       &amp;#034;LeftArrowKeyDown&amp;#034; :&amp;gt; lookHor[prmHORLOOKANGLEDELTA],&#xD;
       &amp;#034;RightArrowKeyDown&amp;#034; :&amp;gt; lookHor[-prmHORLOOKANGLEDELTA],&#xD;
       &amp;#034;UpArrowKeyDown&amp;#034; :&amp;gt; lookVert[prmVERTLOOKANGLEDELTA],&#xD;
       &amp;#034;DownArrowKeyDown&amp;#034; :&amp;gt; lookVert[-prmVERTLOOKANGLEDELTA],&#xD;
       &amp;#034;ReturnKeyDown&amp;#034; :&amp;gt; If[respawnPos =!= Null, move[respawnPos - pos]],&#xD;
       &amp;#034;KeyDown&amp;#034; :&amp;gt; processKeyboard[],&#xD;
       PassEventsDown -&amp;gt; False&#xD;
       };&#xD;
    &#xD;
    inRange = And @@ Thread[{0, 0, 0} &amp;lt; # &amp;lt;= dim] &amp;amp;;&#xD;
    blockAt = blocks[[Sequence @@ #]] &amp;amp;;&#xD;
    setBlock = (blocks[[Sequence @@ #1]] = #2) &amp;amp;;&#xD;
    setMouse[expr_] := MouseAppearance[expr, &amp;#034;Arrow&amp;#034;];&#xD;
    blocksCount[] := Count[blocks, b_ /; b != 0, {3}];&#xD;
    facesCount[] := Count[cubes, Polygon[__], {3}];&#xD;
    blockPos = Ceiling;&#xD;
    &#xD;
    &#xD;
    neighborList[p_] := Block[{cf},&#xD;
       cf = If[transparentQ@blockAt@p,&#xD;
         (blockAt[#] == matAir) &amp;amp;,&#xD;
         (transparentQ@blockAt[#] &amp;amp;)&#xD;
         ];&#xD;
       Quiet[Flatten@&#xD;
         Position[p + # &amp;amp; /@ dirVectors, _?(inRange[#] &amp;amp;&amp;amp; cf[#] &amp;amp;), {1}, &#xD;
          Heads -&amp;gt; False]]&#xD;
       ];&#xD;
    &#xD;
    createCube[coords_] :=&#xD;
      Polygon[coords + # &amp;amp; /@ vertCoords[[#]], &#xD;
         VertexTextureCoordinates -&amp;gt; vtc] &amp;amp; /@ &#xD;
       faceCoords[[neighborList@coords]];&#xD;
    &#xD;
    setCube[coords_, type_] := (&#xD;
      AppendTo[cubes[[type]], createCube[coords]];&#xD;
      AppendTo[cubePointers[[type]], coords];&#xD;
      )&#xD;
    &#xD;
    addBlock[bp : {_Integer, _Integer, _Integer}?inRange] := (&#xD;
       setBlock[bp, palette[[curBlockType]]];&#xD;
       setCube[bp, palette[[curBlockType]]];&#xD;
       updateNeighbors@bp;&#xD;
       );&#xD;
    &#xD;
    neighborCoords[p_] := &#xD;
      Quiet[Cases[&#xD;
        p + # &amp;amp; /@ dirVectors, _?(inRange[#] &amp;amp;&amp;amp; blockAt[#] != matAir &amp;amp;), &#xD;
        1]];&#xD;
    &#xD;
    updateNeighbors[p_] := Block[{np, locs},&#xD;
       np = neighborCoords@p;&#xD;
       locs = &#xD;
        ParallelMap[Position[cubePointers, #, {2}, Heads -&amp;gt; False] &amp;amp;, np];&#xD;
       (cubes[[Sequence @@ (First@#1)]] = createCube@#2) &amp;amp; @@@ &#xD;
        Transpose@{locs, np};&#xD;
       ];&#xD;
    &#xD;
    deleteBlock[bp : {_Integer, _Integer, _Integer}?inRange] := &#xD;
      Block[{loc},&#xD;
       loc = Position[cubePointers, bp, {2}, Heads -&amp;gt; False];&#xD;
       setBlock[bp, 0];&#xD;
       cubePointers = Delete[cubePointers, loc[[1]]];&#xD;
       cubes = Delete[cubes, loc[[1]]];&#xD;
       updateNeighbors@bp;&#xD;
       ];&#xD;
    &#xD;
    addCurrentBlock[] :=&#xD;
      If[newBlockPos != blockPos@pos,&#xD;
       getSelection[];&#xD;
       addBlock@newBlockPos;&#xD;
       move@{0, 0, 0};&#xD;
       getSelection[];&#xD;
       updateCubes[];&#xD;
       ];&#xD;
    &#xD;
    deleteCurrentBlock[] := (&#xD;
       getSelection[];&#xD;
       deleteBlock@currentBlockPos;&#xD;
       getSelection[];&#xD;
       processFalling[];&#xD;
       updateCubes[];&#xD;
       );&#xD;
    &#xD;
    getSelection[] := Block[{flag, found, chain, mp},&#xD;
       flag = False;&#xD;
       mp = MousePosition[&amp;#034;Graphics3DBoxIntercepts&amp;#034;, Null];&#xD;
       currentBlockPos = newBlockPos = Null;&#xD;
       selection = {};&#xD;
       If[mp === Null, Return[]];&#xD;
       v = Normalize[Subtract @@ mp];&#xD;
       If[v.viewDir &amp;lt; 0, v = -v];&#xD;
       found = (flag = (Last@# &amp;lt; 0 || blockAt[blockPos@#] != 0)) &amp;amp;;&#xD;
       chain = NestWhileList[&#xD;
         # + prmTRACESTEP v &amp;amp;,&#xD;
         pos,&#xD;
         (And @@ Thread[{0, 0, -1} &amp;lt; # &amp;lt; dim]) &amp;amp;&amp;amp; (! found@#) &amp;amp;,&#xD;
         1, Ceiling[prmACTIONRANGE/prmTRACESTEP]];&#xD;
       If[flag,&#xD;
        currentBlockPos = blockPos@chain[[-1]];&#xD;
        selection = {EdgeForm@{Black, Thick},&#xD;
          FaceForm[None],&#xD;
          Cuboid[currentBlockPos - 1, currentBlockPos]&#xD;
          };&#xD;
        If[Length@chain &amp;gt; 1, newBlockPos = blockPos@chain[[-2]]];&#xD;
        ];&#xD;
       ];&#xD;
    &#xD;
    (*World generation*)&#xD;
    &#xD;
    randomWalkPattern[nb_, m_, d_] := &#xD;
      Module[{n = prmWORLDWIDTH, q, i0, j0, i1, j1, field, applyAt, &#xD;
        offset, ok, p, next},&#xD;
       field = Array[0 &amp;amp;, {n, n}];&#xD;
       applyAt = &#xD;
        Function[{i, j}, field[[i - m ;; i + m, j - m ;; j + m]] += 1];&#xD;
       offset = RandomInteger[d {-1, 1}, {2}] &amp;amp;;&#xD;
       ok = (m &amp;lt; #1 &amp;lt;= n - m) &amp;amp;&amp;amp; (m &amp;lt; #2 &amp;lt;= n - m ) &amp;amp;;&#xD;
       next = (While[! ok @@ (q = # + offset[]), q]; q) &amp;amp;;&#xD;
       p = Floor[{n, n}/2];&#xD;
       Do[applyAt @@ p; p = next@p, {Round[nb/(2 m + 1)^2]}];&#xD;
       If[prmSMOOTHTERRAIN,&#xD;
        ListConvolve[BoxMatrix[2]/25, field] // Round,&#xD;
        field]&#xD;
       ];&#xD;
    &#xD;
    createTerrain[bc_] := Block[{field},&#xD;
       field = randomWalkPattern[bc, prmTERRAINGRAIN, prmTERRAINOFFSET];&#xD;
       With[{h = Min[field[[##]], prmWORLDHEIGHT]},&#xD;
          blocks[[#1, #2, 1 ;; h]] = &#xD;
           RandomChoice[{matGravel, matStone}, h];&#xD;
          blocks[[#1, #2, 1]] = RandomChoice@{matBedrock, matDirt};&#xD;
          If[1 &amp;lt; h &amp;lt; RandomInteger@{4, 9},&#xD;
           blocks[[#1, #2, h - 1 ;; h]] = matDirt;&#xD;
           If[RandomChoice@{True, False}, blocks[[#1, #2, h]] = matGrass];&#xD;
           ];&#xD;
          ] &amp;amp; @@@ Position[field, b_ /; b &amp;gt; 0, {2}];&#xD;
       ];&#xD;
    &#xD;
    createClouds[nClouds_] := &#xD;
      Block[{cloud, ww = prmWORLDWIDTH, wh = prmWORLDHEIGHT, i, j, h},&#xD;
       Do[&#xD;
         cloud = randomWalkPattern[RandomInteger@{200, 1000}, 1, 2];&#xD;
         {i, j} = RandomInteger[{-ww, ww}/2, 2];&#xD;
         h = RandomInteger@{wh/2, wh};&#xD;
         Quiet[blocks[[#1 + i, #2 + j, h]] = matClouds] &amp;amp; @@@ &#xD;
          Position[cloud, b_ /; b != 0, {2}],&#xD;
         {nClouds}&#xD;
         ];&#xD;
       ];&#xD;
    &#xD;
    &#xD;
    initMaterials[];&#xD;
    initIcons[];&#xD;
    initBlocks[];&#xD;
    createTerrain[prmTERRAINBLOCKSN];&#xD;
    createClouds[prmCLOUDSN];&#xD;
    initFloor[];&#xD;
    initCubes[];&#xD;
    initCamera[];&#xD;
    &#xD;
    updateCubes[];&#xD;
    &#xD;
     (*Scene assembly and UI*)&#xD;
    scene = Graphics3D[{Dynamic@floor, EdgeForm@None, Dynamic@cucubes, &#xD;
        Dynamic@selection},&#xD;
       ViewVector -&amp;gt; Dynamic@{pos, pos + viewDir},&#xD;
       ViewRange -&amp;gt; prmVIEWRANGE,&#xD;
       PlotRange -&amp;gt; All,&#xD;
       Lighting -&amp;gt; &amp;#034;Neutral&amp;#034;,&#xD;
       Boxed -&amp;gt; False,&#xD;
       BoxRatios -&amp;gt; Automatic,&#xD;
       ImageSize -&amp;gt; &#xD;
        Dynamic@AbsoluteCurrentValue[EvaluationNotebook[], WindowSize],&#xD;
       ViewAngle -&amp;gt; prmVIEWANGLE,&#xD;
       Background -&amp;gt; prmSKYCOLOR,&#xD;
       PlotRangePadding -&amp;gt; 0,&#xD;
       Epilog -&amp;gt; {crosshair, Inset[Dynamic@paletteGfx, Scaled@{.5, .05}]}&#xD;
       ];&#xD;
    &#xD;
    crosshair = {White, AbsoluteThickness@2,&#xD;
       Line[{Scaled@{.49, .5}, Scaled@{.51, .5}}],&#xD;
       Line[{Scaled@{.5, .49}, Scaled@{.5, .51}}]&#xD;
       };&#xD;
    &#xD;
    CreateDocument[&#xD;
      EventHandler[&#xD;
       setMouse@Style[scene, Selectable -&amp;gt; False, Editable -&amp;gt; False],&#xD;
       actions&#xD;
       ],&#xD;
      CellMargins -&amp;gt; 0,&#xD;
      ShowCellBracket -&amp;gt; False,&#xD;
      ShowCellLabel -&amp;gt; False,&#xD;
      &amp;#034;TrackCellChangeTimes&amp;#034; -&amp;gt; False,&#xD;
      WindowElements -&amp;gt; {},&#xD;
      WindowFrame -&amp;gt; &amp;#034;Normal&amp;#034;,&#xD;
      WindowSize -&amp;gt; Full,&#xD;
      &amp;#034;BlinkingCellInsertionPoint&amp;#034; -&amp;gt; False,&#xD;
      &amp;#034;CellInsertionPointCell&amp;#034; -&amp;gt; {},&#xD;
      WindowMargins -&amp;gt; Automatic,&#xD;
      WindowTitle -&amp;gt; &amp;#034;Mathematicraft&amp;#034;,&#xD;
      Background -&amp;gt; Black,&#xD;
      Editable -&amp;gt; False,&#xD;
      NotebookEventActions -&amp;gt; actions,&#xD;
      TextAlignment -&amp;gt; Center,&#xD;
      Deployed -&amp;gt; True,&#xD;
      RenderingOptions -&amp;gt; {&amp;#034;Graphics3DRenderingEngine&amp;#034; -&amp;gt; &#xD;
         prmRENDERINGENGINE}&#xD;
      ];&#xD;
    &#xD;
    blocksCount[]&#xD;
    facesCount[]&#xD;
&#xD;
&#xD;
 &#xD;
&#xD;
&#xD;
  [1]: https://community.wolfram.com//c/portal/getImageAttachment?filename=3364hero.gif&amp;amp;userId=20103&#xD;
  [2]: http://www.minecraftwiki.net/wiki/Classic&#xD;
  [3]: http://mathematica.stackexchange.com/q/5649/219&#xD;
  [4]: http://mathematica.stackexchange.com/q/17557/219&#xD;
  [5]: https://www.google.by/search?q=minecraft+textures&amp;amp;hl=ru&amp;amp;tbo=d&amp;amp;source=lnms&amp;amp;tbm=isch&amp;amp;sa=X&amp;amp;ei=A7QeUe2ZPMnEswaJnICIAQ&amp;amp;ved=0CAoQ_AUoAQ&amp;amp;biw=1174&amp;amp;bih=642&#xD;
  [6]: http://demonstrations.wolfram.com/BlockBuilder/&#xD;
  [7]: http://community.wolfram.com//c/portal/getImageAttachment?filename=8566sdf453qwyrgdfzg9rgdfv.png&amp;amp;userId=11733&#xD;
  [8]: https://community.wolfram.com//c/portal/getImageAttachment?filename=92938.png&amp;amp;userId=20103&#xD;
  [9]: https://community.wolfram.com//c/portal/getImageAttachment?filename=93129.png&amp;amp;userId=20103&#xD;
  [10]: https://community.wolfram.com//c/portal/getImageAttachment?filename=465010.png&amp;amp;userId=20103&#xD;
  [11]: https://community.wolfram.com//c/portal/getImageAttachment?filename=356211.png&amp;amp;userId=20103&#xD;
  [12]: https://community.wolfram.com//c/portal/getImageAttachment?filename=694312.png&amp;amp;userId=20103&#xD;
  [13]: https://community.wolfram.com//c/portal/getImageAttachment?filename=13.gif&amp;amp;userId=20103&#xD;
  [14]: http://pastebin.com/CjFZ3MpW&#xD;
  [15]: http://mathematica.stackexchange.com/q/19669/13&#xD;
  [16]: https://dl.dropbox.com/u/66486755/scene.mmc</description>
    <dc:creator>Boris Faleichik</dc:creator>
    <dc:date>2016-03-02T16:14:21Z</dc:date>
  </item>
  <item rdf:about="https://community.wolfram.com/groups/-/m/t/230347">
    <title>Injecting Computation to Homer Simpson</title>
    <link>https://community.wolfram.com/groups/-/m/t/230347</link>
    <description>After reading Stephen Wolfram&amp;#039;s mind-blowing blog post &amp;#034;[url=http://blog.stephenwolfram.com/2014/03/injecting-computation-everywhere-a-sxsw-update/]Injecting Computation Everywhere[/url]&amp;#034;, I tried to reproduce some of the demos he showed in his SXSW talk. The results obtained by adding just a little bit of code were quite surprising to me. Here is what happened when I injected the data from an accelerometer connected to an Arduino into the Wolfram Language using a few Graphics3D directives, six spheres, and a pair of tubes:

[img=width: 800px; height: 472px;]/c/portal/getImageAttachment?filename=ArduinoConnected.jpg&amp;amp;userId=56204[/img]

Tadaa! Homer Simpson started computing the trajectory of a three-dimensional object!

[img=width: 669px; height: 440px;]/c/portal/getImageAttachment?filename=simpson.gif&amp;amp;userId=56204[/img]

To create this experiment, I used the same hardware listed by Arnoud in this post [url=http://community.wolfram.com/groups/-/m/t/160740]&amp;#034;Connecting to an accelerometer over a serial connection&amp;#034;[/url] (but instead of connecting the Arduino to the Raspberry Pi, I used my own laptop):[list]
[*][url=http://www.adafruit.com/products/163]5V ready triple-axis accelerometer (+-3g analog output)[/url]
[*][url=http://www.adafruit.com/products/64]Half-size bread board[/url]
[*][url=http://www.adafruit.com/products/758]Male/male jumper wires[/url]
[*][url=http://www.adafruit.com/products/50]Arduino Uno[/url]
[/list]Once everything has been wired, follow these steps:
[list]
[*]First upload the following Sketch to your Arduino:
[/list][code] void setup()
 {   Serial.begin(115200); }
 void loop()
{
  long x = analogRead(0);
  long y = analogRead(1);
  long z = analogRead(2);
  Serial.write(x);
  Serial.write(y);
  Serial.write(z);
  delay(100);
}[/code][list]
[*]Take note of the serial port name that you will be using.
[/list][list]
[*]Connect the Arduino to the Wolfram Language. (The name of my port was &amp;#034;usbmodemfa131&amp;#034;; replace it with the one you are using.)
[/list][mcode]serial = DeviceOpen[&amp;#034;Serial&amp;#034;, {&amp;#034;/dev/tty.usbmodemfa131&amp;#034;, &amp;#034;BaudRate&amp;#034; -&amp;gt; 115200}][/mcode][list]
[*]Check the data streamed over serial when the accelerometer lies flat:
[/list][mcode]Partition[DeviceReadBuffer[serial], 3][/mcode][list]
[*]Then define a function to scale the three XYZ coordinates between -1g and 1g:
[/list][mcode]g[x_] := Divide[84 - x, -84][/mcode][list]
[*]When the function is applied to the buffer&amp;#039;s values while the accelerometer lies flat, the average vector should approximate {0,0,1}g:
[/list][mcode]Partition[g /@ DeviceReadBuffer[serial], 3] // Mean // N[/mcode][list]
[*]Once the accelerometer has been calibrated, run the following ScheduledTask every 0.1 seconds:
[/list][mcode]RunScheduledTask[ bee = Partition[g /@ DeviceReadBuffer[serial], 3] // Mean // Normalize // N, 0.1][/mcode]
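When you are done, the polling task and the serial connection can be shut down. This housekeeping step is not part of the original walkthrough, but both functions are standard Wolfram Language:
[mcode]RemoveScheduledTask[ScheduledTasks[]];
DeviceClose[serial][/mcode]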
Now, you are ready to dynamically control Homer Simpson&amp;#039;s eyes :)

[mcode]Graphics3D[{
  Orange,
  Sphere[Dynamic[3*bee], 0.2],
  White,
  Sphere[eyeL = {0.1, 0.9, -2}, 1],
  Sphere[eyeR = {0.1, -0.9, -2}, 1],
  Gray,
  Sphere[Dynamic[Normalize[3 bee - eyeL] + eyeL], 0.25],
  Sphere[Dynamic[Normalize[3 bee - eyeR] + eyeR], 0.25],
  Yellow,
  Tube[{{-1, 0, -4}, {3, 0, -4}}, 2.5],
  Tube[{{0.8, 0, -1.8}, {1, 0, -0.8}}, 0.4],
  Brown,
  Sphere[{3, 0, -3}, 2.3]},
 Lighting -&amp;gt; {{&amp;#034;Ambient&amp;#034;, GrayLevel[.1]}, {&amp;#034;Point&amp;#034;, White, Dynamic[3.5 bee]}},
 ViewPoint -&amp;gt; {-0.8, -2.1, 2.5}, 
 ViewVertical -&amp;gt; {-1.2, -0.8, 0.5},
 PlotRange -&amp;gt; {{-4, 3}, {-3.2, 3.2}, {-7.5, 4}},
 Background -&amp;gt; Black,
 Boxed -&amp;gt; False][/mcode]</description>
    <dc:creator>Bernat Espigulé</dc:creator>
    <dc:date>2014-04-02T20:32:43Z</dc:date>
  </item>
  <item rdf:about="https://community.wolfram.com/groups/-/m/t/96837">
    <title>Connection to Sphero robot from Mathematica</title>
    <link>https://community.wolfram.com/groups/-/m/t/96837</link>
    <description>I have been an intern at Wolfram Research for the summer and ended up writing [url=http://github.com/philngo/sphero-mathematica]this package[/url] for connecting to a [url=http://www.gosphero.com]Sphero[/url] robot. Sphero is a little spherical robot that you can drive on land or in water from your iPhone, iPad, or Android tablet, and (now) from Mathematica.

The package uses [url=http://library.wolfram.com/infocenter/Demos/5726]SerialIO[/url] and Bluetooth to connect to Sphero. To help people get started, I included a sample notebook that shows how to load the package, connect via Bluetooth, etc. I embedded the SerialIO package as well. 

You&amp;#039;ll need to figure out which virtual serial port was opened by Bluetooth, but once you do so, this command gets you up and running. I&amp;#039;ve only done this on a Mac, so if anyone succeeds on a Windows or Linux machine, would you post how you did it?
[mcode]mySphero = SpheroDeviceConnect[&amp;#034;/dev/tty.Sphero-BWR-RN-SPP&amp;#034;][/mcode]
The package is in its infancy, but it already lets you do some pretty cool stuff. My favorite use is controlling a Graphics3D object: twist and turn Sphero in your hand and watch the Graphics3D object mimic its twists and turns on the screen. If you find other cool ways to interact with Sphero from within Mathematica, please post them.

Access Sphero&amp;#039;s data with this command, which returns a List of rules that can be used to access the various data fields that Sphero streams. 
[mcode]SpheroDeviceData[mySphero][/mcode]
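For example, a single field can then be pulled out of the returned rules (the key &amp;#034;Accelerometer&amp;#034; below is hypothetical; use whichever keys SpheroDeviceData actually reports on your setup):
[mcode]&amp;#034;Accelerometer&amp;#034; /. SpheroDeviceData[mySphero][/mcode]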
I would love feedback on how to improve the package. By the way, Sphero 2.0 (it looks very cool) came out today; I&amp;#039;m not sure if this package will still work with it, but let me know if it does.</description>
    <dc:creator>Phil Ngo</dc:creator>
    <dc:date>2013-08-14T21:47:27Z</dc:date>
  </item>
  <item rdf:about="https://community.wolfram.com/groups/-/m/t/90931">
    <title>Connecting the Leap Motion controller to Mathematica Using JLink</title>
    <link>https://community.wolfram.com/groups/-/m/t/90931</link>
    <description>So the Leap Motion controller has just been released, and I thought I might as well post some code that Todd Gayley and I wrote to connect the Leap Motion and all of its functionality to Mathematica!

Here are the setup instructions:


1. (If you haven&amp;#039;t already) download and install the Leap Motion
software from [b][url=http://www.leapmotion.com/setup]leapmotion.com/setup[/url][/b]

2. Download the Leap Motion SDK from their developer page.

3. Connect the Leap Motion to your computer.

4. Find the file &amp;#034;LeapJava.jar&amp;#034; inside the Leap SDK (it should be under lib).

5. Open Mathematica and paste this in:
[mcode]Needs[&amp;#034;JLink`&amp;#034;];
ReinstallJava[CommandLine -&amp;gt; &amp;#034;java&amp;#034;, 
  JVMArguments -&amp;gt; 
   &amp;#034;-Djava.library.path=[path to directory CONTAINING LeapJava.jar]&amp;#034;];[/mcode]Now put the path to &amp;#034;LeapJava.jar&amp;#034; in the indicated position; remember that this is the path to the directory containing LeapJava.jar, not the path to LeapJava.jar itself. Then run the code.

6. Now paste this in:[mcode]AddToClassPath[&amp;#034;[path to LeapJava.jar]&amp;#034;];[/mcode]Replace the indicated area with the path to LeapJava.jar itself and run it.

7. To setup the controller, run this:[mcode]controller = JavaNew[&amp;#034;com.leapmotion.leap.Controller&amp;#034;][/mcode]

You should now be fully connected and ready to go! To see if it is working, you can run:[mcode]Methods[controller][/mcode]and it should return a list of all the methods under the controller.

The main method you want to look at is frame[], which contains all of the information about what is going on.

To see the methods under frame[], run this:[mcode]Methods[controller@frame[]]
[/mcode]
From here you should be able to figure things out like this:
[mcode]controller@frame[]@fingers@count[][/mcode]which should count the visible fingers in the scene.

Finally, I want to mention that I was having problems accessing the fingers with this:[mcode]controller@frame[]@finger[finger number][/mcode]so you may want to try:[mcode]controller@frame[]@fingers[]@get[finger number][/mcode]instead. The same goes for palms.
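Once you can reach an individual finger, its tip position should come back as a Leap Vector object. A minimal sketch, assuming the standard LeapJava Finger and Vector methods (tipPosition, getX, getY, getZ):
[mcode]tip = controller@frame[]@fingers[]@get[0]@tipPosition[];
{tip@getX[], tip@getY[], tip@getZ[]}[/mcode]
This should give the first fingertip&amp;#039;s coordinates in millimeters relative to the controller.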

Well, if you write anything cool using this please post it back up here along with any questions you have.

(I am going to try to post some of the code I have written with this soon :D )</description>
    <dc:creator>Christopher Wolfram</dc:creator>
    <dc:date>2013-08-06T17:30:52Z</dc:date>
  </item>
  <item rdf:about="https://community.wolfram.com/groups/-/m/t/35026">
    <title>Masking faces in a webcam stream</title>
    <link>https://community.wolfram.com/groups/-/m/t/35026</link>
    <description>[b][url=http://www.wolfram.com/community/web/community/groups/-/message_boards/message/34483]Christopher&amp;#039;s question about augmented reality[/url][/b] reminded me of a concealed-identity trick used in video streams, where black strips or blurred areas follow characters&amp;#039; faces. We can do this with Mathematica&amp;#039;s [b][url=http://reference.wolfram.com/mathematica/ref/FindFaces.html]FindFaces[/url][/b] functionality. There are a few different ways to overlay transparent or opaque graphics objects and images on a live video stream. One of them is simply using Show. Let us find some simple overlay masks, and what can be better than glasses, mustaches, goatees, and pipes?
[mcode]mask = Import /@ {
   &amp;#034;http://i.imgur.com/155BjOY.png&amp;#034;,
   &amp;#034;http://i.imgur.com/zq3Qsm6.png&amp;#034;,
   &amp;#034;http://i.imgur.com/kONG87s.png&amp;#034;,
   &amp;#034;http://i.imgur.com/WaQkDsn.png&amp;#034;,
   &amp;#034;http://i.imgur.com/JZqCWEN.png&amp;#034;,
   &amp;#034;http://i.imgur.com/9Y5CcUQ.png&amp;#034;,
   &amp;#034;http://i.imgur.com/5CiN4Af.png&amp;#034;
   }
[/mcode][img]http://i.imgur.com/2vd08f3.png[/img]
Now execute this code and bring a few friends to smile into the camera:
[mcode]Manipulate[
 i = CurrentImage[];
 Show[i, Graphics[
   MapThread[
    Inset[If[mix == &amp;#034;RANDOM&amp;#034;, RandomChoice[mask], mix, mix], 
      Offset[{0, -.27 #2}, #1], Center, #2] &amp;amp;,
    {Mean /@ #, -.8 Subtract @@@ #[[All, All, 1]]} &amp;amp;[FindFaces[i]]
    ]
   ], ImageSize -&amp;gt; 450]
 , {{mix, mask[[7]], &amp;#034;&amp;#034;}, mask~Join~{&amp;#034;RANDOM&amp;#034;}, Setter}, 
 FrameMargins -&amp;gt; 0, SynchronousUpdating -&amp;gt; False]
[/mcode][img]http://i.imgur.com/uJOw5Ca.png[/img]
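Since the trick also works with blurred areas rather than opaque masks, here is a blur-based variant. This is a sketch of my own, not from the original code; it assumes, as above, that FindFaces returns face rectangles as {{xmin, ymin}, {xmax, ymax}} point pairs:
[mcode]i = CurrentImage[];
Fold[ImageCompose[#1, Blur[ImageTrim[i, #2], 10], Mean@#2] &amp;amp;, i,
 FindFaces[i]][/mcode]
Each detected face region is cropped with ImageTrim, blurred, and composited back at the rectangle&amp;#039;s center.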
Thanks to our Wolfram volunteers for their avid participation! Let me know if you can think of any improvements to this code. 
Have fun!</description>
    <dc:creator>Vitaliy Kaurov</dc:creator>
    <dc:date>2013-02-01T22:50:28Z</dc:date>
  </item>
</rdf:RDF>

