Community RSS Feed
http://community.wolfram.com
RSS Feed for Wolfram Community showing any discussions in tag Staff Picks sorted by active

Google Translate Structure (TextStructure like function)
http://community.wolfram.com/groups/-/m/t/1096168
`TextStructure` is a very nice new function in Mathematica. It can create amazing things like:
TextStructure@"If love be blind, love cannot hit the mark."
![TextStructure][1]
Can we do the same for translations?
This piece of code downloads a JSON-like response from Google Translate without needing the official API (which I never bothered to learn):
GoogleTranslate[str_String] := GoogleTranslate@str = Import[
StringTemplate["https://translate.googleapis.com/translate_a/single?client=gtx&sl=`1`&tl=`2`&dt=t&q=`3`"][
"pt", "en", URLEncode@str], "JSON"][[1, 1, 1]]
And this other piece of code formats the translation.
MakeBoxes[TranslateElement[main_, down_], _] := GridBox[
{{MakeBoxes@main}, {StyleBox[MakeBoxes@down, "TextElementLabel"]}},
BaseStyle -> "TextElementGrid"]
GoogleTranslateStructure[str_String] := Block[{sentence, words, phrase},
sentence = StringSplit[str, p:"."|"," :> p] //. {a___String, s_String, p:"."|",", b___String} :> {a, s<>p, b};
phrase = Table[
words = StringSplit[sentence[[i]], WhitespaceCharacter];
TranslateElement[Row@Riffle[TranslateElement @@@ Transpose@{words, GoogleTranslate /@ words}, " "], GoogleTranslate@sentence[[i]]]
, {i, Length@sentence}];
If[Length@sentence == 1,
phrase[[1]],
TranslateElement[Row@Riffle[phrase, " "], GoogleTranslate@str]
]
]
A usage example would be:
GoogleTranslateStructure@"Se amor é cego, nunca acerta o alvo."
![GoogleTranslateStructure][2]
Changing the language from English to Japanese (which I don't speak, btw):
![JP][3]
Or French:
![FR][4]
[1]: http://community.wolfram.com//c/portal/getImageAttachment?filename=2017-05-17_004746.png&userId=845022
[2]: http://community.wolfram.com//c/portal/getImageAttachment?filename=2017-05-17_005236.png&userId=845022
[3]: http://community.wolfram.com//c/portal/getImageAttachment?filename=2017-05-17_005549.png&userId=845022
[4]: http://community.wolfram.com//c/portal/getImageAttachment?filename=2017-05-17_005723.png&userId=845022

Thales Fernandes, 2017-05-17T03:57:58Z

Building volatility objects – data science approach
http://community.wolfram.com/groups/-/m/t/1164432
*NOTE: all Wolfram Language code and data are available in the attached notebook at the end of the post.*
----------
A novel method for building volatility objects is discussed in the attached note. Machine learning techniques are applied to the construction of financial derivatives volatilities, and the predictions fit the sample data well. The resulting objects are reasonable: they build fast and produce logically correct estimates within the given domains. As such, the data science route offers an interesting alternative to traditional modelling assumptions.
![enter image description here][1]
#Introduction#
Volatility plays a critical role in the modelling and valuation of financial derivatives, so it is not surprising that many quants and researchers devote continuous attention to this subject, its pattern decomposition and its process modelling. Knowing the 'right' volatility, and being able to estimate its future path, is therefore a critical ingredient of consistent derivatives pricing.
Financial volatilities are either given (quoted by broker-dealers) or implied (derived from option prices). By their nature, financial product volatilities are generally 'forward-looking' rather than historical / realised volatilities; this stems from the principles of risk-neutral pricing. When a volatility is quoted, it is in many instances derived from volatility models, which can be simple (such as Black-Scholes) or more complex. Local volatility, SABR and the 'mixture' models such as SVLV are currently the most widely used volatility models in the market. To operate properly, all of these models require extensive calibration to market data.
We propose an *alternative method* for volatility object building that follows a data science approach. Using Mathematica's machine learning routines, we use the **Predict** functionality to build volatility paths by 'learning' from quoted data. We will look at three volatility objects - **FX**, **Equity** and **Interest rate** swaptions - to show Mathematica's capabilities in the construction and management of volatility objects by 'learning' from given examples. The method is generally fast and can be fully automated, which improves its usability and future application in quantitative finance.
#FX Volatility#
FX volatility is quoted in the market in either one or two dimensions. The former is generally a vector of at-the-money (ATM) volatilities for different option maturities, whilst the latter is a 2D surface that adds option strikes to the maturity dimension. Strikes are generally shown on the horizontal axis and quoted in terms of FX delta: 50 delta is ATM, 10 and 25 delta represent out-of-the-money (OTM) calls, whilst 75 and 90 delta reflect puts.
The non-negativity of the FX market leads to a log-normal assumption about the FX data distribution, so the quoted volatility is log-normal in nature (also known as relative volatility).
We use recent FX volatility data for the JPY/BRL currency cross. Option maturities range from 1 day to 10 years, and the FX smile is defined for both calls and puts on the above strike scale.
fxmat = {1/360, 7/360, 14/360, 21/360, 30/360, 60/360, 90/360,
120/360, 150/360, 180/360, 1, 1.5, 2, 3, 4, 5, 7, 10};
fxdelta = {10, 25, 50, 75, 90};
xtbl = Table[{i, j}, {i, fxmat}, {j, fxdelta}] // Transpose;
fxvols0 = {{28.9347360822946, 23.266, 18.69473828125, 17.336,
17.082}, {22.817, 18.889, 18.7004305555556, 14.768,
14.435}, {22.462, 18.476, 18.6928159722222, 14.352,
14.026}, {22.741, 18.775, 18.6928532986111, 14.673,
14.356}, {23.885, 20.609, 18.6928532986111, 17.419,
17.814}, {23.41, 20.449, 18.6928159722222, 17.174,
17.161}, {23.811, 20.843, 18.7004305555556, 17.395,
17.229}, {24.998, 21.005, 18.6759444444444, 17.188,
17.089}, {23.849, 20.471, 18.7022222222222, 16.766,
16.488}, {23.107, 20.095, 18.6926666666667, 16.466,
16.117}, {23.099, 19.807, 18.7308888888889, 15.833,
15.346}, {22.2539294889054, 20.404, 18.578, 16.29,
15.8708687695123}, {22.7761138678155, 20.642, 18.8, 16.449,
16.1821988373345}, {22.0841109536103, 20.326, 18.7, 16.059,
15.4239471417806}, {22.6021123295428, 20.563, 18.6926666666667,
16.158, 15.6147034126386}, {21.4177604234308, 20.017,
18.7308888888889, 15.563, 14.4858894630447}, {22.4813659703195,
20.508, 18.7078518518519, 15.849,
15.023193562278}, {23.0150689352065, 20.75, 18.7104691358025,
16.003, 15.3165637372759}};
fxvols = fxvols0 // Transpose;
The FX volatility surface looks as follows:
TableForm[fxvols // Transpose, TableHeadings -> {fxmat, fxdelta}]
![enter image description here][2]
We can visualise it as follows:
ListPlot3D[fxvols, PlotLabel -> Style["JPY/BRL vol surface", 14]]
![enter image description here][3]
##Training predictor on the FX volatility data##
We use the quoted volatility data as a 'training set' to discover the pattern in the data for prediction purposes. The **Predict** function is our main tool for this task, and we build two objects:
- Vol object with Gaussian process method
- Vol object with Random forest method
We first build the training set object and format it in the required form:
fxvoldata0 =
Table[{fxmat[[i]], fxdelta[[j]]} -> fxvols[[j, i]], {j, 1,
Length[fxdelta]}, {i, 1, Length[fxmat]}];
fxvoldata1 = Flatten[fxvoldata0, 1];
Using the data object, we now train two predictors:
fxvmodelGP =
Predict[fxvoldata1, PerformanceGoal -> "Quality",
Method -> "GaussianProcess"]
fxvmodelRF =
Predict[fxvoldata1, PerformanceGoal -> "Quality",
Method -> "RandomForest"]
![enter image description here][4]
and examine the information about each predictor function
{PredictorInformation[fxvmodelGP],
PredictorInformation[fxvmodelRF]} // Row
![enter image description here][5]
We can now test the predictors on some sample data:
{fxvmodelGP[{7, 10}], fxvmodelRF[{7, 10}], fxvmodelGP[{1/2, 50}],
fxvmodelRF[{1/2, 50}]}
> {22.2577, 22.2484, 18.6702, 18.6932}
We observe a decent fit to the original data. Using the model, we can now build the entire volatility object, filling the gaps in the quoted spectrum:
fxmodres =
Table[{i, j, fxvmodelGP[{i, j}]}, {i, 0.5, 10, 0.25}, {j, 5, 95, 5}];
fxmoevals = Flatten[fxmodres, 1];
ListPlot3D[fxmoevals,
PlotLabel ->
Style["Predicted FX Vol Surface: Gaussian Process approach", Blue,
Italic, 15], AxesLabel -> {"Tenor", "Delta"},
ColorFunction -> "Rainbow", ImageSize -> 400]
![enter image description here][6]
The Gaussian process model builds a smooth and well-behaved volatility surface in both dimensions, nicely smoothing the edges observed in the original data.
fxmodres2 =
Table[{i, j, fxvmodelRF[{i, j}]}, {i, 0.5, 10, 0.25}, {j, 5, 95,
10}];
fxmodvals2 = Flatten[fxmodres2, 1];
ListPlot3D[fxmodvals2,
PlotLabel ->
Style["Predicted FX Vol Surface: Random Forest approach", Blue,
Italic, 15], AxesLabel -> {"Tenor", "Delta"},
ColorFunction -> "TemperatureMap", ImageSize -> 400]
![enter image description here][7]
The nature of the Random forest model means that the modelled surface looks step-wise. If smoothness is preferred for the vol object, then the Gaussian process is the better choice.
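Beyond spot checks, the fit can be quantified. A quick sketch (assuming `fxvoldata1`, `fxvmodelGP` and `fxvmodelRF` defined above are still in scope) compares the in-sample root-mean-square error of the two predictors against the quoted points:

    (* In-sample RMSE of a predictor against the quoted vols;
       fxvoldata1 is a list of rules {maturity, delta} -> vol. *)
    rmse[model_] :=
     Sqrt@Mean[(model[First[#]] - Last[#])^2 & /@ fxvoldata1]
    {rmse[fxvmodelGP], rmse[fxvmodelRF]}

A lower RMSE indicates a closer in-sample fit, though it says nothing about behaviour outside the quoted grid.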
#Equity volatility#
We now look at equity volatility data and build vol objects in a similar way to the FX case above. Equity volatility data exist in a 2D surface format: (i) the option maturity dimension and (ii) the option strike dimension. In this way they closely resemble FX volatilities. By nature, equity volatilities are also log-normal, since equity prices are always positive.
Equity option maturities typically range from 1 month up to 5 years, whereas option strikes - expressed in terms of 'moneyness' - range between 40% and 200%.
We take the recent Nikkei 225 equity volatility data defined on the grid mentioned above:
eqdates = {1/12, 2/12, 3/12, 6/12, 1, 3/2, 2, 3, 4, 5};
eqmoney = {0.4, 0.6, 0.8, 0.9, 0.95, 0.975, 1, 1.025, 1.05, 1.1, 1.2,
1.3, 1.5, 1.75, 2};
eqv = {{54.743, 42.171, 33.275, 24.208, 20.015, 17.999, 16.541,
15.679, 15.173, 15.752, 18.185, 20.298, 24.619, 27.651,
29.413}, {46.068, 39.862, 29.681, 22.664, 19.624, 18.228, 17.138,
16.358, 15.777, 15.358, 16.869, 18.244, 21.776, 24.987,
27.007}, {42.368, 38.084, 27.929, 22.151, 19.673, 18.547, 17.639,
16.995, 16.55, 16.053, 16.726, 17.704, 20.029, 23.105,
25.135}, {42.136, 34.798, 25.302, 21.623, 19.971, 19.281, 18.721,
18.289, 17.966, 17.57, 17.655, 18.439, 19.792, 21.623,
23.704}, {38.829, 30.246, 23.945, 21.428, 20.393, 19.97, 19.614,
19.322, 19.087, 18.762, 18.574, 18.876, 19.975, 21.059,
22.035}, {35.555, 28.012, 23.123, 21.135, 20.363, 20.046, 19.775,
19.547, 19.358, 19.079, 18.844, 18.951, 19.739, 20.737,
21.493}, {33.111, 26.718, 22.555, 20.905, 20.28, 20.022, 19.799,
19.608, 19.446, 19.198, 18.952, 18.968, 19.518, 20.418,
21.127}, {30.028, 25.228, 21.861, 20.629, 20.171, 19.981, 19.813,
19.667, 19.54, 19.339, 19.108, 19.058, 19.347, 20.034,
20.687}, {28.217, 24.273, 21.397, 20.413, 20.049, 19.896, 19.761,
19.641, 19.536, 19.366, 19.156, 19.084, 19.245, 19.765,
20.341}, {26.918, 23.453, 20.936, 20.115, 19.812, 19.683, 19.568,
19.466, 19.376, 19.227, 19.033, 18.952, 19.04, 19.439, 19.936}};
eqvols = eqv // Transpose;
TableForm[eqv, TableHeadings -> {eqdates, eqmoney}]
![enter image description here][8]
This is the equity volatility surface quoted in the market:
eqtab = Table[{eqdates[[i]], eqmoney[[j]], eqv[[i, j]]}, {i, 1,
Length[eqdates]}, {j, 1, Length[eqmoney]}];
eqtabflat = Flatten[eqtab, 1];
ListPlot3D[eqtabflat, PlotLabel -> Style["Nikkei 225 vol surface", 14]]
![enter image description here][9]
The skew above is a typical feature of the equity markets.
##Training predictor on the Nikkei volatility data##
We first configure the vol data object for modelling purposes:
eqs = Table[{eqdates[[i]], eqmoney[[j]]} -> eqv[[i, j]], {i, 1,
Length[eqdates]}, {j, 1, Length[eqmoney]}];
eqdataset = Flatten[eqs, 1];
and then train two predictors:
- Gaussian process
- Neural network
eqvolmodelGP = Predict[eqdataset, Method -> "GaussianProcess"]
eqvolmodelNN = Predict[eqdataset, Method -> "NeuralNetwork"]
![enter image description here][10]
Obtain information about each method:
{PredictorInformation[eqvolmodelGP], PredictorInformation[eqvolmodelNN]}
![enter image description here][11]
We test each predictor on sample data:
{eqvolmodelGP[{1, 0.6}], eqvolmodelNN[{1, 0.6}], eqvolmodelGP[{3, 2}], eqvolmodelNN[{3, 2}]}
> {29.8471, 31.0239, 20.6998, 20.9776}
We can see a decent fit to the original data.
We now generate the full volatility surface by extending the boundaries outside the original domain:
eqmodres =
Table[{i, j, eqvolmodelGP[{i, j}]}, {i, 0.5, 10, 0.25}, {j, 0.2, 3,
0.1}];
eqmodres = Flatten[eqmodres, 1];
ListPlot3D[eqmodres,
PlotLabel ->
Style["Nikkei EQ Vol Surface: Gaussian Process approach", Blue,
Italic, 15], AxesLabel -> {"Tenor", "Money"},
ColorFunction -> "Rainbow", ImageSize -> 400]
![enter image description here][12]
eqmodres =
Table[{i, j, eqvolmodelNN[{i, j}]}, {i, 0.5, 10, 0.25}, {j, 0.2, 3,
0.1}];
eqmodres = Flatten[eqmodres, 1];
ListPlot3D[eqmodres,
PlotLabel ->
Style["Nikkei EQ Vol Surface: Neural network approach", Blue,
Italic, 15], AxesLabel -> {"Tenor", "Money"},
ColorFunction -> "TemperatureMap", ImageSize -> 400]
![enter image description here][13]
Both predictors produce smooth volatility objects, with the Neural network being closer to the underlying data.
#Swaption cube#
Our third example is based on a more complex case - the 3D swaption cube. Interest rate swaptions are defined on a 3D scale: (i) option maturity, (ii) underlying swap maturity and (iii) strike. This makes the case more complicated. Option maturities range from 1 month to 30 years, swap maturities are typically between 1 year and 30 years, and strikes are usually in the range of -200 to +200, where the number represents the basis-point offset from the ATM swap rate.
Since rates in many currencies are now negative, the market has moved from quoting log-normal volatilities to normal ones. These are also known as 'absolute' volatilities and are usually expressed on a rates-convention basis.
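As a rough illustration of the relationship between the two conventions (illustrative numbers, not market data): for an at-the-money option, the normal ('absolute') volatility is approximately the log-normal ('relative') volatility multiplied by the forward rate:

    (* ATM rule of thumb: sigmaNormal ~ sigmaLogNormal * forwardRate *)
    normalVolATM[lognormalVol_, forward_] := lognormalVol*forward
    normalVolATM[0.30, 0.02]  (* 30% log-normal vol on a 2% rate -> 0.006, i.e. 60 bp *)

This approximation breaks down away from the money and as rates approach zero, which is precisely why the market switched conventions.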
We take the recent EUR swaption volatility data and create training set for the Mathematica's predictor:
optmat = {1/2, 3/4, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 12, 15, 20, 25, 30};
swmat = {1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 12, 15, 20, 25, 30};
swox = {-200, -150, -100, -75, -50, -25, 0, 25, 50, 75, 100, 150, 200};
![enter image description here][14]
Fully-defined cubes, such as the EUR one, are generally large:
Map[Length, swv] // Total
> 3315
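This count agrees with the grid defined above: assuming the cube covers every (option maturity, swap maturity, strike) combination, its size is simply the product of the three axis lengths:

    (* 17 option maturities x 15 swap maturities x 13 strikes *)
    Length[optmat] Length[swmat] Length[swox]
    (* 17*15*13 = 3315 *)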
##Training the predictor on the EUR swaption volatility data##
We first build the vol object from the data
swvres1 =
Table[{optmat[[i]], swmat[[k]], swox[[j]]} -> swv[[i, j]], {k, 1,
Length[swmat]}, {j, 1, Length[swox]}, {i, 1, Length[optmat]}];
swvres2 = Flatten[swvres1, 2];
and create three predictors:
- Gaussian process
- Neural network
- Random forest
swvolmodGP =
Predict[swvres2, Method -> "GaussianProcess",
PerformanceGoal -> "Quality"]
![enter image description here][15]
swvolmodNN =
Predict[swvres2, Method -> "NeuralNetwork",
PerformanceGoal -> "Quality"]
![enter image description here][16]
swvolmodRF =
Predict[swvres2, Method -> "RandomForest",
PerformanceGoal -> "Quality"]
![enter image description here][17]
Whilst the Neural network and Random forest are generally fast to build, the Gaussian process is slower:
{Predict[swvres2, Method -> "GaussianProcess",
PerformanceGoal -> "Quality"] // Timing,
Predict[swvres2, Method -> "NeuralNetwork",
PerformanceGoal -> "Quality"] // Timing,
Predict[swvres2, Method -> "RandomForest",
PerformanceGoal -> "Quality"] // Timing}
![enter image description here][18]
We test the predictors on the sample data
{swvolmodGP[{10, 1, 0}], swvolmodNN[{10, 1, 0}],
swvolmodRF[{10, 1, 0}]}
> {0.698667, 0.698477, 0.69649}
We again observe a decent fit to the original data.
Using the three vol models, we predict the volatility data and fill the cubes:
volmodGP =
Table[swvolmodGP[{i, j, k}], {i, 1, 5, 0.5}, {j, 1, 10,
1}, {k, -100, 100, 50}];
volmodNN =
Table[swvolmodNN[{i, j, k}], {i, 1, 10, 0.25}, {j, 1, 10,
1}, {k, -200, 200, 25}];
volmodRF =
Table[swvolmodRF[{i, j, k}], {i, 1, 10, 0.25}, {j, 1, 10,
1}, {k, -200, 200, 25}];
{ListPlot3D[Table[volmodNN[[i]], {i, Length[volmodNN]}],
ColorFunction -> "Rainbow",
PlotLabel -> Style["EUR Swaption cube: NN approach", 12],
AxesLabel -> {"Opt Tenor", "Swap Tenor"}, ImageSize -> 250],
ListPlot3D[Table[volmodGP[[i]], {i, Length[volmodGP]}],
ColorFunction -> "TemperatureMap",
PlotLabel -> Style["EUR Swaption cube: GP approach", 12],
AxesLabel -> {"Opt Tenor", "Swap Tenor"}, ImageSize -> 250],
ListPlot3D[Table[volmodRF[[i]], {i, Length[volmodRF]}],
ColorFunction -> "NeonColors",
PlotLabel -> Style["EUR Swaption cube: RF approach", 12],
AxesLabel -> {"Opt Tenor", "Swap Tenor"}, ImageSize -> 250]}
![enter image description here][19]
All three predictors correctly show the flattening of the surfaces for higher strikes. The Neural network produces the smoothest surface and is additionally the fastest to build, so it may be well suited to live market data and active volatility management.
#Conclusion#
The objective of this note was to show that machine learning offers a viable alternative to traditional volatility models based on single- or multi-factor processes. The data science approach is attractive because it actively 'learns' from available data samples and adjusts its parameters when market conditions or direction change. Mathematica's built-in Predict function provides excellent routines for volatility data fitting, and the three tested methods give reasonable predictions for the modelled data. More importantly, higher dimensions, such as cubes, pose no problem for object rendering, which remains robust and fast.
[1]: http://community.wolfram.com//c/portal/getImageAttachment?filename=FXVolObject.jpg&userId=387433
[2]: http://community.wolfram.com//c/portal/getImageAttachment?filename=sadf453yrtehdgfs.png&userId=11733
[3]: http://community.wolfram.com//c/portal/getImageAttachment?filename=107222.png&userId=20103
[4]: http://community.wolfram.com//c/portal/getImageAttachment?filename=ScreenShot2017-08-14at13.29.58.png&userId=20103
[5]: http://community.wolfram.com//c/portal/getImageAttachment?filename=ScreenShot2017-08-14at13.30.14.png&userId=20103
[6]: http://community.wolfram.com//c/portal/getImageAttachment?filename=86825.png&userId=20103
[7]: http://community.wolfram.com//c/portal/getImageAttachment?filename=76126.png&userId=20103
[8]: http://community.wolfram.com//c/portal/getImageAttachment?filename=59597.png&userId=20103
[9]: http://community.wolfram.com//c/portal/getImageAttachment?filename=19818.png&userId=20103
[10]: http://community.wolfram.com//c/portal/getImageAttachment?filename=ScreenShot2017-08-14at13.38.23.png&userId=20103
[11]: http://community.wolfram.com//c/portal/getImageAttachment?filename=622210.png&userId=20103
[12]: http://community.wolfram.com//c/portal/getImageAttachment?filename=546611.png&userId=20103
[13]: http://community.wolfram.com//c/portal/getImageAttachment?filename=1029512.png&userId=20103
[14]: http://community.wolfram.com//c/portal/getImageAttachment?filename=938213.png&userId=20103
[15]: http://community.wolfram.com//c/portal/getImageAttachment?filename=599914.png&userId=20103
[16]: http://community.wolfram.com//c/portal/getImageAttachment?filename=1037115.png&userId=20103
[17]: http://community.wolfram.com//c/portal/getImageAttachment?filename=182316.png&userId=20103
[18]: http://community.wolfram.com//c/portal/getImageAttachment?filename=944017.png&userId=20103
[19]: http://community.wolfram.com//c/portal/getImageAttachment?filename=138619.png&userId=20103

Igor Hlivka, 2017-08-13T18:06:19Z

[CALL] Common mistakes in using Wolfram Language & Mathematica
http://community.wolfram.com/groups/-/m/t/1070264
[Wolfram Language][1] (WL) is a powerful multi-paradigm programming language. There is a set of common mistakes that repeatedly tend to entrap new users. **This is a call to describe such mistakes, building a "black-listing" guide for novice coders.** Please consider contributing. I suggest the following simple rules (with gratitude adapted from a [similar effort][2]):
- One topic per answer
- Focus on non-advanced uses (it is intended to be useful for beginners and as a question closing reference)
- Include a self explanatory title in header style (example: "# Basic built-in function syntax"; see [syntax guide][3] )
- Explain the symptoms, the mechanism behind the scenes, and all possible causes and solutions you can think of. Be sure to include a beginner's-level explanation (and a more advanced one too, if you can)
*Please, use "**Reply**" to a specific comment for structured clarity of nested comments.*
----------
## Table of Contents
- [Basic syntax of built-in functions][4]
- [Learn how to use the Documentation Center effectively][5]
- [Sorting numerical data and the behavior of Sort][6]
- [What does @#(%=<\[!} et cetera mean?][7]
- [Consider Reap/Sow Instead of AppendTo][8]
- [Case sensitivity and typos][9]
[1]: https://www.wolfram.com/language
[2]: https://mathematica.stackexchange.com/q/18393/13
[3]: http://community.wolfram.com/groups/-/m/t/270507
[4]: http://community.wolfram.com/groups/-/m/t/1069885
[5]: http://community.wolfram.com/groups/-/m/t/1070285
[6]: http://community.wolfram.com/groups/-/m/t/1070705
[7]: http://community.wolfram.com/groups/-/m/t/1070946
[8]: http://community.wolfram.com/groups/-/m/t/1084289
[9]: http://community.wolfram.com/groups/-/m/t/1084920

Vitaliy Kaurov, 2017-04-23T23:54:23Z

Transfer an artistic style to an image
http://community.wolfram.com/groups/-/m/t/1093926
![enter image description here][16]
# Introduction
Back in [Wolfram Summer School 2016][1] I worked on the project "Image Transformation with Neural Networks: Real-Time Style Transfer and Super-Resolution", which was later [published on Wolfram Community][2]. At the time I had to use the MXNetLink package, but now all the needed functionality is built in, so here is a top-level implementation of artistic style transfer in the Wolfram Language. This is a slightly simplified version of the original method, as it uses a single VGG layer to extract the style features, but a full implementation is of course possible with minor modifications to the code. You can also find this example in the docs:
[NetTrain][3] >> Applications >> Computer Vision >> Style Transfer
# Code
Create a new image with the content of a given image and in the style of another given image. This implementation follows the method described in Gatys et al., *A Neural Algorithm of Artistic Style*. An example content and style image:
![enter image description here][4]
To create the image which is a mix of both of these images, start by obtaining a pre-trained image classification network:
vggNet = NetModel["VGG-16 Trained on ImageNet Competition Data"];
Take a subnet that will be used as a feature extractor for the style and content images:
featureNet = Take[vggNet, {1, "relu4_1"}]
![enter image description here][5]
There are three loss functions used. The first loss ensures that the "content" is similar in the synthesized image and the content image:
contentLoss = NetGraph[{MeanSquaredLossLayer[]}, {1 -> NetPort["LossContent"]}]
![enter image description here][6]
The second loss ensures that the "style" is similar in the synthesized image and the style image. Style similarity is defined as the mean-squared difference between the Gram matrices of the input and target:
gramMatrix = NetGraph[{FlattenLayer[-1], TransposeLayer[1 -> 2], DotLayer[]}, {1 -> 3, 1 -> 2 -> 3}];
styleLoss = NetGraph[{gramMatrix, gramMatrix, MeanSquaredLossLayer[]},
{NetPort["Input"] -> 1, NetPort["Target"] -> 2, {1, 2} -> 3, 3 -> NetPort["LossStyle"]}]
![enter image description here][7]
The third loss ensures that the magnitude of intensity changes across adjacent pixels in the synthesized image is small. This helps the synthesized image look more natural:
l2Loss = NetGraph[{ThreadingLayer[(#1 - #2)^2 &], SummationLayer[]}, {{NetPort["Input"], NetPort["Target"]} -> 1 -> 2}];
tvLoss = NetGraph[<|
"dx1" -> PaddingLayer[{{0, 0}, {1, 0}, {0, 0}}, "Padding" -> "Fixed" ],
"dx2" -> PaddingLayer[{{0, 0}, {0, 1}, {0, 0}}, "Padding" -> "Fixed"],
"dy1" -> PaddingLayer[{{0, 0}, {0, 0}, {1, 0}}, "Padding" -> "Fixed" ],
"dy2" -> PaddingLayer[{{0, 0}, {0, 0}, {0, 1}}, "Padding" -> "Fixed"],
"lossx" -> l2Loss, "lossy" -> l2Loss, "tot" -> TotalLayer[]|>,
{{"dx1", "dx2"} -> "lossx", {"dy1", "dy2"} -> "lossy",
{"lossx", "lossy"} -> "tot" -> NetPort["LossTV"]}]
![enter image description here][8]
Define a function that creates the final training net for any content and style image. This function also creates a random initial image:
createTransferNet[net_, content_Image, styleFeatSize_] := Module[{dims = Prepend[3]@Reverse@ImageDimensions[content]},
NetGraph[<|
"Image" -> ConstantArrayLayer["Array" -> RandomReal[{-0.1, 0.1}, dims]],
"imageFeat" -> NetReplacePart[net, "Input" -> dims],
"content" -> contentLoss,
"style" -> styleLoss,
"tv" -> tvLoss|>,
{"Image" -> "imageFeat",
{"imageFeat", NetPort["ContentFeature"]} -> "content",
{"imageFeat", NetPort["StyleFeature"]} -> "style",
"Image" -> "tv"},
"StyleFeature" -> styleFeatSize ] ]
Define a [NetDecoder][9] for visualizing the predicted image:
meanIm = NetExtract[featureNet, "Input"][["MeanImage"]]
> {0.48502, 0.457957, 0.407604}
decoder = NetDecoder[{"Image", "MeanImage" -> meanIm}]
![enter image description here][10]
The training data consists of features extracted from the content and style images. Define a feature extraction function:
extractFeatures[img_] := NetReplacePart[featureNet,
"Input" -> NetEncoder[{"Image", ImageDimensions[img],
"MeanImage" -> meanIm}]][img];
Create a training set consisting of a single example of a content and style feature:
trainingdata = <|
"ContentFeature" -> {extractFeatures[contentImg]},
"StyleFeature" -> {extractFeatures[styleImg]}
|>
Create the training net whose input dimensions correspond to the content and style image dimensions:
net = createTransferNet[featureNet, contentImg,
Dimensions@First@trainingdata["StyleFeature"]];
When training, the three losses are weighted differently to set the relative importance of the content and style. These values might need to be changed with different content and style images. Create a loss specification that defines the final loss as a combination of the three losses:
perPixel = 1/(3*Apply[Times, ImageDimensions[contentImg]]);
lossSpec = {"LossContent" -> Scaled[6.*10^-5],
"LossStyle" -> Scaled[0.5*10^-14],
"LossTV" -> Scaled[20.*perPixel]};
Optimize the image using [NetTrain][11]. [LearningRateMultipliers][12] are used to freeze all parameters in the net except for the [ConstantArrayLayer][13]. The training is best done on a GPU, as it will take up to an hour to get good results with CPU training. The training can be stopped at any time via Evaluation -> Abort Evaluation:
trainedNet = NetTrain[net,
trainingdata, lossSpec,
LearningRateMultipliers -> {"Image" -> 1, _ -> None},
TrainingProgressReporting ->
Function[decoder[#Weights[{"Image", "Array"}]]],
MaxTrainingRounds -> 300, BatchSize -> 1,
Method -> {"ADAM", "InitialLearningRate" -> 0.05},
TargetDevice -> "GPU"
]
![enter image description here][14]
Extract the final image from the [ConstantArrayLayer][15] of the trained net:
decoder[NetExtract[trainedNet, {"Image", "Array"}]]
![enter image description here][16]
[1]: https://education.wolfram.com/summer/school/alumni/2016/salvarezza/
[2]: http://community.wolfram.com/groups/-/m/t/885941
[3]: http://reference.wolfram.com/language/ref/NetTrain.html
[4]: http://community.wolfram.com//c/portal/getImageAttachment?filename=I_432.png&userId=95400
[5]: http://community.wolfram.com//c/portal/getImageAttachment?filename=O_179.png&userId=95400
[6]: http://community.wolfram.com//c/portal/getImageAttachment?filename=O_180.png&userId=95400
[7]: http://community.wolfram.com//c/portal/getImageAttachment?filename=O_181.png&userId=95400
[8]: http://community.wolfram.com//c/portal/getImageAttachment?filename=O_182.png&userId=95400
[9]: http://reference.wolfram.com/language/ref/NetDecoder.html
[10]: http://community.wolfram.com//c/portal/getImageAttachment?filename=O_184.png&userId=95400
[11]: http://reference.wolfram.com/language/ref/NetTrain.html
[12]: http://reference.wolfram.com/language/ref/LearningRateMultipliers.html
[13]: http://reference.wolfram.com/language/ref/ConstantArrayLayer.html
[14]: http://community.wolfram.com//c/portal/getImageAttachment?filename=O_185.png&userId=95400
[15]: http://reference.wolfram.com/language/ref/ConstantArrayLayer.html
[16]: http://community.wolfram.com//c/portal/getImageAttachment?filename=I_466.png&userId=95400

Matteo Salvarezza, 2017-05-15T10:33:59Z

UK Place Name Generator
http://community.wolfram.com/groups/-/m/t/1158843
Every time I see another article pop up wherein somebody trains a neural net to generate names of something, I feel obligated to go back and run the same training sets through my dead-simple setup to do the same thing in the Wolfram Language. This pass at [generating British place names][1] seemed like a fun one, since the training set goes deep into small towns and villages across the UK. So I'll start with a small set of functions I've used before for this sort of thing — "decamel" is a utility function to clean up and split apart any incidental camelcased words that show up in predictions; "nameGenerator" does some minimal string processing on a provided list of Wolfram Language entities or raw strings, and produces a SequencePredictorFunction; "predictionList" produces a list of results of varying lengths using a predictor function:
decamel[str_] :=
StringTrim[
StringJoin[
StringSplit[
str, {RegularExpression["([a-z])([A-Z])"] -> "$1 $2",
RegularExpression["([0-9])([A-Z])"] -> "$1 $2",
RegularExpression["([a-z])([0-9])"] -> "$1 $2"}]]]
predictionList[func_, num_, min_, max_, decam_: True] :=
If[decam == True,
decamel /@
Table[StringTrim@
StringReplace[
func["|", "RandomNextElement" -> RandomInteger[{min, max}]],
"|" -> " "], num],
Table[StringTrim@
StringReplace[
func["|", "RandomNextElement" -> RandomInteger[{min, max}]],
"|" -> " "], num]]
nameGenerator[entOrString_List, extractor_: "SegmentedWords"] :=
Block[{names, list},
With[{heads = DeleteDuplicates[Head /@ entOrString]},
Which[
heads === {Entity},
names = CommonName[DeleteMissing[entOrString]];
list =
StringRiffle[StringSplit["|" <> # <> "|"], "|"] & /@ names;
SequencePredict[list, FeatureExtractor -> extractor],
heads === {String},
names = StringTrim /@ DeleteMissing[entOrString];
list =
StringRiffle[StringSplit["|" <> # <> "|"], "|"] & /@ names;
SequencePredict[list, FeatureExtractor -> extractor]]]]
So I'll start by importing the file used in the original article, and just grabbing place names out of it (it also includes some numerical IDs, and county names):
uknames = Import[
"https://cdn.obrienmedia.co.uk/cdn/farfuture/5-1bFjgWmjONhWhk9sGAeYzlIzhwHRSBIF_Fzr55UYs/mtime:1425905283/sites/default/files/uk_towns_and_counties.csv"];
namelist = uknames[[All, 2]] // Rest // DeleteDuplicates;
In[84]:= Select[namelist, StringContainsQ["("]][[;; 10]]
Out[84]= {"Wdig (Goodwick)", "Vermuden's Drain (Forty Foot)",
"Valley (Dyffryn)", "Usk (Brynbuga)", "Upper Largo (Kirkton of Largo)",
"Uisage Dubh (Black Water)", "Tyddewi (St David's)",
"Treorci (Treorchy)", "Treorchy (Treorci)", "Trent (Piddle)"}
I don't want to try to generate names with parenthetical transcriptions or alternate forms, so let's split those up and treat the parentheticals as distinct names for training purposes:
In[78]:= splitter[rec_] :=
StringTrim[StringSplit[rec, "("], {" ", ")"}]
In[105]:= newlist = Flatten[splitter /@ namelist];
In[83]:= Length[newlist]
Out[83]= 41245
Then all that's left to do is make the SequencePredictorFunction, and generate some names (removing predicted names that were already in the training set, or that end with words of fewer than 5 characters):
ukpl = nameGenerator[newlist, "SegmentedCharacters"];
Multicolumn[
Complement[
Select[predictionList[ukpl, 400, 8, 18],
StringLength[StringSplit[#, " "][[-1]]] > 4 &], newlist], 6]
![enter image description here][2]
Not all of these are gems (or even readable), but there's some good stuff in here — my personal favorites include:
- Blackleaze Ferry
- Bleburgh
- Farmlingthorpe
- Kirphook
- Low of Gosbe
- Roebucklecott
- Stainton Doirkmill
- Tattin Grime
- Toberland Garker
- Winstapleton Dalby
[1]: https://medium.com/@hondanhon/i-trained-a-neural-net-to-generate-british-placenames-9460e907e4e9
[2]: http://community.wolfram.com//c/portal/getImageAttachment?filename=4329uknamepredictions.png&userId=21095

Alan Joyce, 2017-08-04T17:43:14Z

Castlevania Stage 1 Demo
http://community.wolfram.com/groups/-/m/t/1161057
Update 1: added examples about side-scrolling and the reason for a time-scale factor.
Since making [Flappy Bird in the Wolfram Language][1], I decided to up the ante with a playable demo of a video game classic, Castlevania for the Nintendo Entertainment System. To limit the scope of the project, I only recreated the first stage, but modified it to play like "horde mode" where enemies continuously spawn:
![Gameplay][2]
The package can be downloaded from my github account: [CastlevaniaDemo_WolframLanguage][3].
# Controls #
I wanted to avoid using keyboard events since they block; i.e., you can't hold the right-arrow key and press another key (to jump) at the same time. Instead, I use an Xbox controller: I'm on a Windows computer, which already has the device drivers installed, so connecting the controller is as easy as turning it on. If you want to use a different controller, then you may need to re-map some of the [`ControllerState`][4]s.
![enter image description here][5]
I tried to mimic the original controls where you can move left and right, as well as crouch or stand using the directional pad. Only two buttons are used: one to jump and one to crack the whip. The crouch doesn't help much in such a simple level, but it's there nonetheless.
# Design #
## Side-scrolling ##
----------
A 2-D side-scroller moves the screen as the player advances. This is straightforward to achieve using a dynamic [`PlotRange`][6]. I found the image from a basic Google search, cropped it, and used [`Inset`][7] to place it in a [`Graphics`][8] expression. To put everything in an understandable coordinate system, I adjusted the relative size (4.19) until the image spanned the vertical space from 0 to 1. The single static image `background` is
![Background][9]
Side-scrolling is implemented by updating the [`PlotRange`][10] dynamically over an otherwise static graphic:
Manipulate[
Graphics[
Inset[background, {0, 0}, Scaled[{0, 0}], 4.19],
Frame -> True, PlotRangeClipping -> True,
PlotRange -> {{Dynamic[scrollPos], Dynamic[screenWidth + scrollPos]}, {0, 1}}
],
{{scrollPos, 0, "Screen Position"}, 0, 3},
{{screenWidth, 1, "Screen Width"}, 1, 2}
]
![Scroll Position][11]
The above code is a demonstration; the actual game updates `scrollPos` as the player moves. There are additional restrictions based on where the player is located: the player can walk to the edge of the screen, but the screen stops moving when the player is close to either end of the level.
In the following, `csX` is the x-direction on the directional pad. Its value is +1 if held to the right and -1 if held to the left.
Attributes[moveRight] = {HoldRest};
moveRight[csX_, fps_] := (
(* face player in direction of movement *)
If[!playerFacingRight, playerFacingRight=True];
(* update player sprite and screen position *)
With[{p=playerX+(playerXvel/fps)*csX}, If[p < 3.55, playerX=p]];
If[scrollPos < 2.7 && ((scrollPos+screenWidth)/2.4 < playerX), scrollPos+=(scrollSpeed/fps)]
);
Attributes[moveLeft] = {HoldRest};
moveLeft[csX_, fps_] := (
(* face player in direction of movement *)
If[playerFacingRight, playerFacingRight=False];
(* update player sprite and screen position *)
With[{p=playerX+(playerXvel/fps)*csX}, If[0.07 < p, playerX=p]];
If[scrollPos > 0 && (scrollPos+screenWidth/2.3 > playerX), scrollPos-=(scrollSpeed/fps)]
);
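As an aside, the stop-at-the-level-boundaries behavior in `moveRight` and `moveLeft` can be summarized as a single clamp. This is only a sketch with illustrative bounds, not the constants the game actually uses:

```mathematica
(* center the camera on the player, but never scroll past either end of the level;
   0 and 2.7 stand in for the left and right scroll limits *)
scrollPos = Clip[playerX - screenWidth/2., {0., 2.7}]
```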
## Controlling Update Speed ##
----------
Instead of using a [`ScheduledTask`][12] to update at a fixed frame rate, I opted to let [`Dynamic`][13] update as fast as possible. This is reminiscent of old computer games that were tied to the processor's frequency. To get around this I use a global variable `fps` to slow down the apparent motion. Said differently, the front end updates as fast as possible, but the distance a sprite moves per kernel-state-update is inversely proportional to `fps`.
The previous section shows `fps` in the player update:
p = playerX + (playerXvel / fps)*csX
The player's horizontal-position `playerX` is updated by the fixed horizontal-velocity `playerXvel`, but proportional to `fps`. If `fps` is larger, then the distance traveled per-kernel-update is smaller.
For every moving sprite, however, the front end has to communicate more with the kernel. This causes the entire game to slow down proportionally to the number of dynamically updating expressions. I more-or-less get around this detail by linearly adjusting the global `fps` variable proportional to the number of active sprites. The rough idea is that if each sprite takes 2 units of `fps` time to update, then I should take away 2 units for each active enemy.
As a toy example of this behavior, look at the following independent example:
fps = 10;
objList = <||>;
createShape[index_Integer] := Module[{pos = RandomReal[{-30, 0}, 2]},
object[index] := (
(* updated object position *)
pos += 0.3/fps;
(* if object leaves the screen, remove it from the list of objects, otherwise output graphic *)
If[pos.pos > 30^2,
objList = KeyDrop[objList, index]; object[index] =.,
{EdgeForm[Directive[Thick]], FaceForm[None], Rectangle[pos, pos + {5, 5}]}
]
);
objList = <|objList, index :> object[index]|>
]
i = 1;
Column[{
Button["Add", createShape[i++]],
Graphics[Dynamic[Values[objList]],
PlotRange -> {{-30, 30}, {-30, 30}}, Frame -> True,
ImageSize -> Medium],
Dynamic[Length[objList]]
}]
![enter image description here][14]
Ignoring that this code produces a front-end memory leak, what you should observe is that the rectangles' motion slows down as more are added. Once a few rectangles leave the "world", the remaining rectangles speed up.
In my game code where I create enemies, I use a variable `tsf` (time-scale-factor) which is just a local renaming of `fps`. It decreases when an enemy is put in motion i.e. initialized, and increases when the enemy dies or leaves the screen. Below I only show the parts where `tsf` is adjusted. `speedTrigger` is used as a flag to prevent `tsf` from updating more than once.
initializeEnemy[index, "ghoulRight"] := (
. . .
speedTrigger=True; tsf-=2;
. . .
enemy[index] := (
If[!dying,
(* if exited map *)
If[xPos > 3.8,
. . .
If[speedTrigger, tsf+=2; speedTrigger = False];
. . .
],
(* if dead, after death animation, position enemy off screen, update score *)
If[speedTrigger, tsf+=2; speedTrigger = False];
. . .
]
)
)
## Sprites before hit boxes ##
----------
It makes more sense to place and size the sprites into the graphics before I create hit boxes. Sprite maps can be [found online][15] for many classic games. I found a character map for the main character sprite and cropped each frame. There are 12 unique frames in total for my chosen character movement, but for smooth animations some frames can be found in multiple sequences:
- four frames for walking
![enter image description here][16]
- four frames for whipping while standing
![enter image description here][17]
- four frames for whipping while crouched
![enter image description here][18]
- three frames for character death
![enter image description here][19]
The above images have already been adjusted. First, I applied [`RemoveBackground`][20] and [`ImageCrop`][21] to each frame. If you naively stopped here and simply updated the first argument of [`Inset`][22] with these images, then your character would appear glitchy when switching between images. The reason is that the inset images are of different sizes. I needed to "normalize" the images to a common size.
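The normalization step can be sketched as follows; `padTo` is a hypothetical helper (not from the original code) that centers a cropped frame on a canvas of the target size, assuming the frame is no larger than that size:

```mathematica
(* pad a frame up to a common {w, h}, splitting the extra pixels evenly;
   White is used here for visibility -- frames processed with RemoveBackground
   would want a transparent padding instead *)
padTo[im_Image, {w_, h_}] := Module[{d = ImageDimensions[im], px, py},
  px = w - d[[1]]; py = h - d[[2]];
  ImagePad[im, {{Floor[px/2], Ceiling[px/2]}, {Floor[py/2], Ceiling[py/2]}}, White]
]

(* e.g. padTo[frame, {105, 42}] brings every frame to the common 105 x 42 size *)
```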
The whip frames have the largest extent (see the selected images above). The whip hangs below the feet while crouched, and it extends far to the left/right when fully extended. I chose a common image size of 105 x 42 and first used [`ImagePad`][23] to pad each image up to that size. Then I created functions for fine-tuning the sprite positions:
imageRotateLeft[im_, amount_] := If[amount == 0,
im,
ImageAssemble[{{ImageTake[im, {1, -1}, {1 + amount, -1}], ImageTake[im, {1, -1}, {1, amount}]}}]
]
imageRotateRight[im_, amount_] := If[amount == 0,
im,
ImageAssemble[{{ImageTake[im, {1, -1}, {-amount, -1}], ImageTake[im, {1, -1}, {1, -1 - amount}]}}]
]
I use a combination of [`Manipulate`][24] expressions to align and center each frame. One aligns one frame with another:
Manipulate[
left = tt[[1]]; right = tt[[2]];
Framed[ImageCompose[left, imageRotateLeft[right, n]], FrameMargins -> None],
{n, 0, 10, 1}
]
![enter image description here][25]
The other aligns a frame with its reflection:
Manipulate[
left = tt[[m]]; right = ss[[m]];
Framed[ImageCompose[imageRotateRight[left, n], imageRotateLeft[right, n]], FrameMargins -> None],
{n, 0, 10, 1}, {m, 1, 4, 1}
]
![Reflection Alignment][26]
Having a reflection-aligned set of frames is useful when making hit boxes: we only need one hit box for standing and one for crouching, regardless of whether the player faces right or left. Moreover, with a "one-size-fits-all" image size, the `Inset` size argument remains fixed.
The ghoul enemy proceeds similarly, but is simpler since it has only two animation frames and no moving weapon.
## Character Controls ##
----------
I use a [`Module`][27] to mimic "static" variables for the player character.
Module[{playerFacingRight=True, playerCrouched=False, playerX=0.288, playerY=0.282,
playerXvel=0.2, scrollSpeed=0.2, im=walkingRight[[1]],
playerAnimationCounter=1, playerFrame=1, walkAnimationDelay=25,
grav=0.5, jumpVel=0.5, playerYvel=0, jumpCounter=0, previousJumpState=False,
whipFrame=1, whipAnimationCounter=1, previousWhipState=False, whipping=False, whipAnimationDelay=8,
whipStandLeftBox,whipStandRightBox,whipCrouchedLeftBox,whipCrouchedRightBox,
crouchedBox,standingBox,playerBox,whipSoundCheck=True},
Attributes[standWhipUpdate] = {HoldAll};
standWhipUpdate[whipBox_] := ( . . . )
Attributes[crouchWhipUpdate] = {HoldAll};
crouchWhipUpdate[whipBox_] := ( . . . )
Attributes[moveRight] = {HoldRest};
moveRight[csX_, fps_] := ( . . . )
Attributes[moveLeft] = {HoldRest};
moveLeft[csX_, fps_] := ( . . . )
Attributes[character] = {HoldRest};
character[{csX_,csY_,csB_, csB2_}, enemyBox_, whipBox_, fps_, playerDead_] := (
.
.
.
Inset[im, {playerX,playerY}, Scaled[{0.5,0.5}], 0.562]
)
]
Within this module I define a number of functions that share the scoped module variables. The main function is `character` that, after processing the current state of the player, returns the `Inset` graphics primitive.
The `cs*` inputs are `ControllerState["X3"]`, `ControllerState["Y3"]`, `ControllerState["B1"]`, and `ControllerState["B2"]`, respectively. The other inputs are the hit boxes, time delay, and a predicate flag for whether the player is dead.
The `character` function logic is as follows:
<ul>
<li>Has the character hit an enemy?
<ul><li>yes -> you're dead</li><li>no -> you're not dead</li></ul>
</li>
<li>If not dead, then
<ul>
<li>Check the jump button</li>
<li>Check the whip button</li>
</ul>
</li>
<li>Add gravity effect (you fall regardless of whether you're alive)</li>
<li>If not dead,
<ul>
<li>if jumping,
<ul>
<li>are you whipping?
<ul><li>yes</li><li>no</li></ul>
</li>
</ul>
</li>
<li>else you're walking,
<ul>
<li>are you whipping?
<ul><li>yes</li><li>no -> update normal walking controls</li></ul>
</li>
</ul>
</li>
</ul>
</li>
<li>else you're dead -> update death animation</li>
<li>output Inset primitive with updated position and sprite image</li>
</ul>
For example, while on the ground and not whipping, I use a [`Which`][28] expression to decide the update:
Which[
(* walking to the right *)
csX==1 && (csY==0 || csY==1),
moveRight[csX, fps];
playerCrouched=False;
(* animate character sprite *)
playerAnimationCounter += 1;
If[Mod[playerAnimationCounter,walkAnimationDelay]==0,
playerAnimationCounter=1;
im=walkingRight[[If[playerFrame==4,playerFrame=1,++playerFrame]]]
],
(* walking to the left *)
csX==-1 && (csY==0 || csY==1),
moveLeft[csX, fps];
playerCrouched=False;
(* animate character sprite *)
playerAnimationCounter += 1;
If[Mod[playerAnimationCounter,walkAnimationDelay]==0,
playerAnimationCounter=1;
im=walkingLeft[[If[playerFrame==4,playerFrame=1,++playerFrame]]]
],
(* crouched to the right *)
csX==1 && csY==-1,
If[!playerFacingRight, playerFacingRight=True];
playerCrouched=True;
playerAnimationCounter=9;
im=crouchRight,
(* crouched to the left *)
csX==-1 && csY==-1,
If[playerFacingRight, playerFacingRight=False];
playerCrouched=True;
playerAnimationCounter=9;
im=crouchLeft,
(* crouch down *)
csX==0 && csY==-1,
If[!playerCrouched,
playerCrouched=True;
im = If[playerFacingRight, crouchRight, crouchLeft]
],
(* stand up *)
csX==0 && csY==1,
playerFrame=1;
If[playerCrouched,
playerCrouched=False;
im = If[playerFacingRight, walkingRight, walkingLeft][[playerFrame]]
]
]
You'll see that the character's animation involves a delayed update: not only am I slowing the apparent motion with the `fps` variable, I'm also slowing the apparent animation by using a `playerAnimationCounter` with a fixed `walkAnimationDelay`.
I did include a double jump, though it is somewhat superfluous given the simplicity of the level.
(* check state of jump button *)
Switch[{previousJumpState,csB},
{False,True}, If[jumpCounter<2,jumpCounter+=1;playerYvel=(jumpVel/fps);previousJumpState=True],
{True,False}, previousJumpState=False;
];
## Enemy Controls ##
----------
I use a [`Module`][29] to mimic "static" variables for the enemies as well.
Attributes[createEnemy] = {HoldRest};
createEnemy[index_Integer, whipBox_, enemyBox_, tsf_, score_, enemyReady_] := Module[
{xPos, yPos, xVel, yVel, im, speedTrigger=False, dying=False, size,
enemyAnimationCounter=1, enemyFrame=1, enemyAnimationDelay=25},
enemyBox = {{0,0},{0,0}};
initializeEnemy[index, "ghoulRight"] := ( . . . )
initializeEnemy[index, "ghoulLeft"] := ( . . . )
]
Within this module I define a number of functions that share the scoped module variables. The main function is `initializeEnemy` that itself defines another function `enemy`. It seems complicated, but this type of scoping helps avoid memory leaks and treats each enemy as an independent object. The `enemy` function processes the current state of the enemy and returns an `Inset` graphics primitive.
Adding a new type of enemy is straightforward: create a new `initializeEnemy` code block with an overloaded pattern, e.g. `initializeEnemy[index, "medusaRight"]`.
The ghoul logic is much simpler:
<ul>
<li>is the enemy dead?
<ul>
<li>no
<ul>
<li>update position</li>
<li>have you hit the whip box? yes -> set dead flag</li>
<li>have you left the screen? yes -> move off screen and set ready flag</li>
</ul>
</li>
<li>yes -> do death animation and reset</li>
</ul>
</li>
<li>output Inset primitive with updated position and sprite image</li>
</ul>
For example, the ghoul's movement logic is much simpler than the character's: it only moves left or right.
initializeEnemy[index, "ghoulLeft"] := (
. . .
enemy[index] := (
If[!dying,
(* if alive *)
(* update position and enemy hit box *)
xPos += xVel/tsf;
enemyBox = {{-0.037,-0.075}+{xPos,yPos},{0.029,0.081}+{xPos,yPos}};
. . .
]
)
. . .
)
## Hit Box Management ##
----------
The hit boxes follow the [`Rectangle`][30] syntax in that they specify a box by its lower-left and upper-right corners only. All hit boxes are un-rotated rectangles. I check whether one hit box lies entirely to the left/right of, or above/below, the other; if neither is the case, they must be overlapping.
impactQ = Compile[{{box1,_Real,2},{box2,_Real,2}},
(* if one hitbox is on the left side of the other *)
If[box1[[2,1]] < box2[[1,1]] || box1[[1,1]] > box2[[2,1]], Return[False]];
(* if one hitbox is above the other *)
If[box1[[2,2]] < box2[[1,2]] || box1[[1,2]] > box2[[2,2]], Return[False]];
True
];
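As a quick check of the overlap test (values assume the `impactQ` definition above):

```mathematica
impactQ[{{0., 0.}, {1., 1.}}, {{0.5, 0.5}, {2., 2.}}]  (* True: the boxes overlap *)
impactQ[{{0., 0.}, {1., 1.}}, {{2., 2.}, {3., 3.}}]    (* False: box2 lies entirely to the upper right *)
```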
The enemies and player's hit boxes are given as relative distances from their centers. For example,
(* define character hit boxes *)
whipStandLeftBox = {{-0.2`,0.02`},{-0.06`,0.07`}};
whipStandRightBox = {{0.06`,0.02`},{0.2`,0.07`}};
whipCrouchedLeftBox = {{-0.204`,-0.02`},{-0.052`,0.034`}};
whipCrouchedRightBox = {{0.052`,-0.02`},{0.204`,0.034`}};
crouchedBox = {{-0.02`,-0.06`},{0.02`,0.038`}};
standingBox = {{-0.02`,-0.06`},{0.02`,0.086`}};
playerBox = standingBox + {{playerX,playerY},{playerX,playerY}};
I determined the hit boxes using a Manipulate:
Manipulate[
playerX = 0.288; playerY = 0.282; im = ImageReflect[Import["whip3.png"], Right];
Graphics[
{Inset[im, {playerX, playerY}, Scaled[{0.5, 0.5}], 0.562],
FaceForm[None], EdgeForm[Directive[Red, Thick]],
Dynamic[ Rectangle[{playerX + left, playerY + bottom}, {playerX + right, playerY + top}] ]
},
Frame -> False, ImageSize -> Medium, PlotRange -> {{0, 1}, {0, 0.5}}
]
,
{{left, -0.2}, -0.5, 0.5, LabeledSlider},
{{bottom, -0.2}, -0.5, 0.5, LabeledSlider},
{{right, 0.2}, -0.5, 0.5, LabeledSlider},
{{top, 0.2}, -0.5, 0.5, LabeledSlider}
]
![enter image description here][31]
The whip is more complicated. The whip's hit box is only active when the whip is fully extended, and its position depends on which direction the player is facing. For example, movement is allowed while in the air. Note in the following that the whip box is active only during `whipFrame` #3; otherwise, within the `standWhipUpdate` function, the whip box is moved off screen so it doesn't hit anything.
(* while in the air *)
playerFrame=1; playerCrouched=False;
Which[
csX==1, moveRight[csX, fps],
csX==-1, moveLeft[csX, fps]
];
If[whipping,
(* if whipping, player is in standing position *)
standWhipUpdate[whipBox];
(* whipping in the air is unique as movement is allowed; hit boxes need to update with the player's movement *)
playerBox = standingBox;
If[whipFrame==3, whipBox = If[playerFacingRight, whipStandRightBox, whipStandLeftBox] + {{playerX,playerY},{playerX,playerY}}],
(* if not whipping, the legs tuck midway through the jump *)
If[-(jumpVel/fps)/1.5 < playerYvel < (jumpVel/fps)/1.5,
playerBox = crouchedBox;
im = If[playerFacingRight, crouchRight, crouchLeft],
playerBox = standingBox;
im = If[playerFacingRight, walkingRight, walkingLeft][[playerFrame]];
];
]
The hit boxes are passed around between the functions. The player only cares about the enemy hit boxes:
If[
AnyTrue[{enemyBox[1],enemyBox[2],enemyBox[3],enemyBox[4],enemyBox[5],
enemyBox[6],enemyBox[7],enemyBox[8],enemyBox[9],enemyBox[10]},
impactQ[playerBox,#]&
],
playerDead = True; playerCrouched=True;
playerBox = {{1,1},{1,1}};
playerFrame=1;
im = If[playerFacingRight, playerDeathRight, playerDeathLeft][[playerFrame]];
playerAnimationCounter=1;
];
While the enemies only care about the whip hit box:
If[impactQ[whipBox,enemyBox],
dying = True; EmitSound[hitSound]; score++;
enemyBox = {{0,0},{0,0}};
size = 0.035;
enemyFrame=1;
im = enemyDeath[[enemyFrame]];
enemyAnimationCounter=1;
];
## Final bells and whistles ##
----------
The final step is putting all the code into a larger [`DynamicModule`][32]. I added a static pause screen that also displays the controls, using a [`PaneSelector`][33]. In theory you can pause the game at any time; in practice the other dynamic updates can sometimes block the button. It works better to press and hold the button until the screen appears, or to hold a direction on the directional pad and then press "start".
The "restart" button, which is only active after you die, has the same issue. I haven't figured out how to fix this.
When you restart after dying, you start in a crouched whipping position and your score is reset.
I also added sound effects for the whip and for when you hit an enemy. Since the song "Vampire Killer" is so iconic, I added optional background music from a MIDI file available at [Video Game Music Headquarters][34], but I did not loop it.
The enemies are generated based on a [`RandomVariate`][35] taken from a [`NormalDistribution`][36] with mean 100 'cycles' and a wide variance.
playLevel[] := Deploy@DynamicModule[{whipBox={{-10,-10},{-5,-5}}, fps=50,
enemyBox, playerDead=False, score=0, enemyReady, previousButtonState=False, pressed=False, enemyCounter=1,
enemyVariate=RandomVariate[NormalDistribution[100,30]]},
(* start background music *)
(*EmitSound[music];*)
(* use PaneSelector to toggle between pause screen and active game *)
PaneSelector[
{
True ->
Column[{
Graphics[
{
Inset[background, {0,0}, Scaled[{0,0}], 4.19],
Dynamic[Inset[Style[score,Red,Bold,27], {scrollPos+screenWidth-0.1,0.9}]],
Dynamic[
If[playerDead,
If[ControllerState["B7"],
playerDead=False; score=0; character[{1,-1,False,True}, enemyBox, whipBox, fps, playerDead]
];
Inset[Panel[Style[" Play again? \nPress Restart Button.",Red,Bold,27],Background->Black], {scrollPos+screenWidth*0.5,0.5}],
{}
]
],
Dynamic[{
If[(enemyCounter += 1) > enemyVariate,
enemyCounter=1; enemyVariate=RandomVariate[NormalDistribution[100,30]];
initializeEnemy[
First[Pick[{1,2,3,4,5,6,7,8,9,10}, enemyReady /@ {1,2,3,4,5,6,7,8,9,10}], {}],
RandomChoice[{"ghoulLeft","ghoulRight"}]
]
];
enemy[1],enemy[2],enemy[3],enemy[4],enemy[5],
enemy[6],enemy[7],enemy[8],enemy[9],enemy[10],
character[
{
ControllerState["X3"],ControllerState["Y3"],
ControllerState["B1"],ControllerState["B3"]
},
enemyBox, whipBox, fps, playerDead
]
}]
},
Frame -> False,
ImageSize -> Large,
PlotRange -> {{Dynamic[0+scrollPos],Dynamic[screenWidth+scrollPos, TrackedSymbols:>{scrollPos}]}, {0,1}},
PlotRangePadding -> None,
PlotRangeClipping -> True,
ImagePadding -> 1
](*,
LabeledSlider[Dynamic[fps],{1,72}]*)
}],
False -> controls
},
Dynamic[
Switch[{previousButtonState,ControllerState["B8"]},
{False,True}, If[pressed,pressed=False,pressed=True]; previousButtonState=True,
{True,False}, previousButtonState=False
];
pressed
],
ImageSize -> Automatic
](* end PaneSelector *),
Initialization :> (
Do[With[{i=i}, createEnemy[i, whipBox, enemyBox[i], fps, score, enemyReady]], {i,1,10}];
enemy[1]=enemy[2]=enemy[3]=enemy[4]=enemy[5]=enemy[6]=enemy[7]=enemy[8]=enemy[9]=enemy[10] = {};
enemyReady[1]=enemyReady[2]=enemyReady[3]=enemyReady[4]=enemyReady[5] = True;
enemyReady[6]=enemyReady[7]=enemyReady[8]=enemyReady[9]=enemyReady[10] = True;
)
]
[1]: http://community.wolfram.com/groups/-/m/t/1127985
[2]: http://community.wolfram.com//c/portal/getImageAttachment?filename=play.gif&userId=829295
[3]: https://github.com/KMDaily/CastlevaniaDemo_WolframLanguage
[4]: http://reference.wolfram.com/language/ref/ControllerState.html
[5]: http://community.wolfram.com//c/portal/getImageAttachment?filename=Controls2.png&userId=829295
[6]: http://reference.wolfram.com/language/ref/PlotRange.html
[7]: http://reference.wolfram.com/language/ref/Inset.html
[8]: http://reference.wolfram.com/language/ref/Graphics.html
[9]: http://community.wolfram.com//c/portal/getImageAttachment?filename=LevelMap.png&userId=829295
[10]: http://reference.wolfram.com/language/ref/PlotRange.html
[11]: http://community.wolfram.com//c/portal/getImageAttachment?filename=scrollPosition.PNG&userId=829295
[12]: http://reference.wolfram.com/language/ref/ScheduledTask.html
[13]: http://reference.wolfram.com/language/ref/Dynamic.html
[14]: http://community.wolfram.com//c/portal/getImageAttachment?filename=slowDown.PNG&userId=829295
[15]: https://www.spriters-resource.com/
[16]: http://community.wolfram.com//c/portal/getImageAttachment?filename=characterWalking.PNG&userId=829295
[17]: http://community.wolfram.com//c/portal/getImageAttachment?filename=6947characterSpriteExtent.PNG&userId=829295
[18]: http://community.wolfram.com//c/portal/getImageAttachment?filename=characterCrouchExtent.PNG&userId=829295
[19]: http://community.wolfram.com//c/portal/getImageAttachment?filename=characterDeath.PNG&userId=829295
[20]: http://reference.wolfram.com/language/ref/RemoveBackground.html
[21]: http://reference.wolfram.com/language/ref/ImageCrop.html
[22]: http://reference.wolfram.com/language/ref/Inset.html
[23]: http://reference.wolfram.com/language/ref/ImagePad.html
[24]: http://reference.wolfram.com/language/ref/Manipulate.html
[25]: http://community.wolfram.com//c/portal/getImageAttachment?filename=imageAlignment.PNG&userId=829295
[26]: http://community.wolfram.com//c/portal/getImageAttachment?filename=3893reflectionAlignment.PNG&userId=829295
[27]: http://reference.wolfram.com/language/ref/Module.html
[28]: http://reference.wolfram.com/language/ref/Which.html
[29]: http://reference.wolfram.com/language/ref/Module.html
[30]: http://reference.wolfram.com/language/ref/Rectangle.html
[31]: http://community.wolfram.com//c/portal/getImageAttachment?filename=whipbox.PNG&userId=829295
[32]: http://reference.wolfram.com/language/ref/DynamicModule.html
[33]: http://reference.wolfram.com/language/ref/PaneSelector.html
[34]: http://www.vgmusic.com
[35]: http://reference.wolfram.com/language/ref/RandomVariate.html
[36]: http://reference.wolfram.com/language/ref/NormalDistribution.html

Kevin Daily, 2017-08-09T21:18:36Z

Custom Interface Example
http://community.wolfram.com/groups/-/m/t/1156321
![Three screen shots of Feng Shui][1]
My students practice some math skills through interactive apps I develop. *Feng Shui*, for instance, helps them gain proficiency with function composition. I made a Wolfram Language version of *Feng Shui*. The results seemed interesting enough to share on this list.
The user clicks on the cover art and then answers six randomly generated function composition questions. The app tracks stats and reports a score at the end based on speed, accuracy, and difficulty of the questions. A score of 8000 indicates proficiency. Top scores are over 9000.
I've provided the full code below. To make it work, copy the attached FengShuiCover.jpg into where the code says
coverPic=
Then evaluate the code. To start, click on the cover art. When the math question appears, click once anywhere in the interface and start typing an answer. When you have typed a correct answer, you will automatically advance to another question.
**Wolfram Advantage:** In the Wolfram Language, *Feng Shui*'s code shrunk to about a quarter of its HTML/JavaScript/PHP size, partly because of Wolfram's compact syntax and partly because Wolfram can handle math much more efficiently. For instance, to recognize a correct answer, JavaScript has to compare several variations of a polynomial (4x^2 – 9x + 1, 1 – 9x + 4x^2, etc) to what the user has typed, while Wolfram just compares the underlying symbolic expressions once.
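A minimal illustration of that symbolic comparison:

```mathematica
(* both strings parse to the same canonical polynomial,
   so a single structural comparison suffices *)
ToExpression["4x^2-9x+1"] === ToExpression["1-9x+4x^2"]
(* True *)
```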
**Typing Math:** I feel strongly that end users should not have to know some arcane set of key strokes to type math. In *Feng Shui*, the user types 4x^2-3x+1 but sees the following appear as she types:
![enter image description here][2]
To accomplish this, I reduced the string that the user typed to characters and replaced some patterns of characters with boxes (e.g., **SuperscriptBox[]** or **StyleBox[]**). This simple example italicizes “x”:
Characters[resp] /. "x" -> StyleBox["x", FontSlant -> Italic]
I have been able to extend this method in other programming languages to more complex math expressions, so I assume that I would be able to in Wolfram too. I'd like to thank the experts on this list, particularly Gianluca Gorni, for providing examples of box manipulation to solve such challenges.
**Set Insertion Point:** A challenge that I was not able to overcome was that the user has to click inside *Feng Shui* before typing. I think that the function I need to set the insertion point inside the interface is **SelectionMove[]**, but I was not able to get it to work for me. Any suggestions would be appreciated.
**Interface Design:** When I first started learning the Wolfram Language, it was easy to make Manipulate[] expressions and deploy them, but it wasn't so obvious how to make and deploy other kinds of user interfaces. Threads on this list indicate that other users have this problem too. *Feng Shui* and the larger project *[Chicken Scratch][3]* are examples of custom interfaces made with the Wolfram Language.
I've tested the .nb (the code listing below) on Mac OS 10.12.6 using Mathematica 11.0.1.0. I've also attached a .cdf file, which I have tested on Mac and Windows. Enjoy. Let me know if you have any questions.
coverPic =
happySound = Sound[{SoundNote["A5", .02, "Crystal"],SoundNote["C7", .1, "Crystal", SoundVolume -> .75]}];
sadSound = Sound[{SoundNote["AFlat2", .07, "Kalimba", SoundVolume -> .5]}];
font = FontFamily -> "Times New Roman";
magBlue = RGBColor["#000099"];
newQ[] := {
state = 2;
hint = "";
qFormNo = RandomInteger[{1, 4}];
qForm = TraditionalForm[{f[g[x]] == "?", (f\[SmallCircle]g)[x] == "?", g[f[x]] == "?", (g\[SmallCircle]f)[x] == "?"}[[qFormNo]]];
Switch[RandomInteger[{1, 4}],
1, (* f(x)=ax+b g(x)=cx+d *)
diffPts += 50;
co = RandomChoice[DeleteCases[Range[-12, 12], 0], 2];
con = RandomInteger[{-99, 99}, 2];
fx = TraditionalForm[f[x] == co[[1]] x + con[[1]]];
gx = TraditionalForm[g[x] == co[[2]] x + con[[2]]];
If[qFormNo < 3,
ans = Expand[Composition[co[[1]] # + con[[1]] &, co[[2]] # + con[[2]] &][x]];
hintTxt = StringForm["substitute and simplify: `1`", TraditionalForm[co[[1]] (g[x]) + con[[1]]]],
ans = Expand[Composition[co[[2]] # + con[[2]] &, co[[1]] # + con[[1]] &][x]];
hintTxt = StringForm["substitute and simplify: `1`", TraditionalForm[co[[2]] (f[x]) + con[[2]]]]],
2, (* f(x)=ax^2 g(x)=bx *)
diffPts += 125;
co = RandomChoice[DeleteCases[Range[-#, #], 0]] & /@ {3, 12};
con = {};
fx = TraditionalForm[f[x] == co[[1]] x^2];
gx = TraditionalForm[g[x] == co[[2]] x];
If[qFormNo < 3,
ans = Expand[Composition[co[[1]] #^2 &, co[[2]] # &][x]];
hintTxt = StringForm["substitute and simplify: `1`", TraditionalForm[co[[1]] (g[x])^2]],
ans = Expand[Composition[co[[2]] # &, co[[1]] #^2 &][x]];
hintTxt = StringForm["substitute and simplify: `1`", TraditionalForm[co[[2]] (f[x])]]],
3, (* f(x)=ax+b g(x)=cx^2 *)
diffPts += 225;
co = RandomChoice[DeleteCases[Range[-#, #], 0]] & /@ {5, 8};
con = RandomInteger[{-24, 24}];
fx = TraditionalForm[f[x] == co[[1]] x + con];
gx = TraditionalForm[g[x] == co[[2]] x^2];
If[qFormNo < 3,
ans = Expand[Composition[co[[1]] # + con &, co[[2]] #^2 &][x]];
hintTxt = StringForm["substitute and simplify: `1`", TraditionalForm[co[[1]] (g[x]) + con]],
ans = Expand[Composition[co[[2]] #^2 &, co[[1]] # + con &][x]];
hintTxt = StringForm["substitute and simplify: `1`", TraditionalForm[co[[2]] (f[x])^2]]],
4, (* f(x)=ax^2+bx+c g(x)=dx+e *)
diffPts += 325;
co = RandomChoice[DeleteCases[Range[-12, 12], 0], 2];
con = RandomInteger[{-12, 12}];
fx = TraditionalForm[f[x] == x^2 + co[[1]] x + con];
gx = TraditionalForm[g[x] == co[[2]] x];
If[qFormNo < 3,
ans = Expand[Composition[#^2 + co[[1]] # + con &, co[[2]] # &][x]];
hintTxt = StringForm["substitute and simplify: `1`", TraditionalForm[(g[x])^2 + co[[1]] (g[x]) + con]],
ans = Expand[Composition[co[[2]] # &, #^2 + co[[1]] # + con &][x]];
hintTxt = StringForm["substitute and simplify: `1`", TraditionalForm[co[[2]] (f[x])]]]],
diffPts += 43*Count[Flatten[Join[{co}, {con}]], x_ /; x < 0]; (* negatives *)
diffPts += 100*Log10[Total[Abs[CoefficientList[ans, x]]]]; (* ans totals *)
pre = StringForm["`1`\t`2`", fx, gx];
q = qForm;
resp = "";
};
keyDown[k_] := {
If[state != 2, Return];
If[MatchQ[ToCharacterCode[k], ({8} | {127} | {46} | Except[{_?NumberQ}])],
bkSp++;
resp = If[resp === Null || StringLength[resp] < 2, Null, StringDrop[resp, -1]],
If[StringContainsQ["x1234567890^-+", k],
resp = If[resp === Null || resp === "", k, resp <> k];
If[ToExpression[resp] == ans,
EmitSound[happySound];
qCt++;
If[qCt < 6, newQ[], youWon[]]],
EmitSound[sadSound]]];
If[state == 2,
q = StringForm["`1``2`",
qForm /. "?" -> "",
If[resp === Null || resp === "", "?",
char = Characters[resp] /.
"x" -> StyleBox["x", FontSlant -> Italic] //.
{a___, b : Longest[Repeated["1" | "2" | "3" | "4" | "5" | "6" | "7" | "8" | "9" | "0"]], c___} -> {a, StringJoin[b], c} /.
{a___, b_, "^", c_, d___} -> {a, SuperscriptBox[b, c], d} /.
a_ -> RowBox[a];
char // DisplayForm]]]
};
youWon[] := {
state = 3;
bTime = Now;
instructions = "";
elTime = DateDifference[aTime, bTime, "Second"];
timePts = 4000 (120/QuantityMagnitude[elTime])^.6;
accPts = 3000 - 53 bkSp;
score = timePts + accPts + diffPts;
If[score < 6500, score = 6500 - (6500 - score)^.8];
If[score > 9500, score = 9500 + (score - 9500)^.8];
score = Round[score];
pre = "";
q = Style[ToString[score] <> " Points", 72, Blue];
hint = "";
};
showHint[] := {If[state == 2, bkSp += 9; hint = hintTxt]};
start[] := {
score = 0;
diffPts = 0;
bkSp = 0;
qCt = 0;
aTime = Now;
instructions = Style[StringForm["Complete the function rule.\nType x^2 for `1`.", TraditionalForm[x^2]], 18, font, TextAlignment -> Center];
game = EventHandler[Column[{
Pane[Style[StringForm["`1` to go", Dynamic[6 - qCt]], 20, font], {600, 30}, Alignment -> Right, FrameMargins -> {{0, 8}, {0, 8}}],
Pane[Style[Dynamic[pre], magBlue, 26, font], {600, 50}, FrameMargins -> {{20, 0}, {0, 0}}],
Pane[Style[Dynamic[q], magBlue, 36, font], {600, 160}, Alignment -> Center],
Pane[Style[Dynamic[hint], magBlue, 24, font], {600, 60}, Alignment -> Center],
Row[{
Pane[Button[Style["Start Over", 20, font], newGame[], Appearance -> "Frameless"], {120, 100}, Alignment -> {Left, Bottom}, FrameMargins -> {{8, 0}, {8, 0}}],
Pane[Dynamic[instructions], {360, 100}, Alignment -> {Center, Top}, FrameMargins -> {{0, 0}, {0, 8}}],
Pane[Button[Style["Get Hint", 20, font], showHint[], Appearance -> "Frameless"], {120, 100}, Alignment -> {Right, Bottom}, FrameMargins -> {{0, 8}, {8, 0}}]}]
}, Spacings -> 0], {"KeyDown" :> keyDown[CurrentValue["EventKey"]]}];
newQ[]};
newGame[] := {
state = 1;
game = Button[Image[coverPic, ImageSize -> {600, 400}], start[], Appearance -> "Frameless"];
Framed[Panel[Dynamic[game], Background -> White, FrameMargins -> None], FrameMargins -> None]};
newGame[]
[1]: http://community.wolfram.com//c/portal/getImageAttachment?filename=FengShui_3SS.jpg&userId=788861
[2]: http://community.wolfram.com//c/portal/getImageAttachment?filename=ScreenShot2017-07-31at7.41.15AM.png&userId=788861
[3]: http://community.wolfram.com/groups/-/m/t/1149014?p_p_auth=N48t3vn3

Mark Greenberg, 2017-07-31T21:17:47Z

[WMP17] Convert Pixel Image into Vector Graphics
http://community.wolfram.com/groups/-/m/t/1160838
As part of my Wolfram Mentorship Program, I chose to work on this interesting and challenging project that combines image processing, artificial intelligence, graph theory, and the Wolfram Language (who can think of a better combination?!). Since part of the work is confidential and can't be shared (it uses some internal WL code), it's worth mentioning that this post is not intended to be a full step-by-step tutorial, but rather a conceptual overview of what we are working on. I hope you'll enjoy it and find it useful!
## Project Description ##
Write a function to find an approximation to images as vector graphics (effectively: "convert GIF to SVG"). Flags may be a helpful example to consider. There will be a tradeoff between "fitting" the image and having a simple "model" in terms of vector graphics.
Category: Algorithms, Artificial Intelligence
Suggested readings:
- http://reference.wolfram.com/language/guide/ImageProcessing.html
- http://reference.wolfram.com/language/ref/Interpolation.html
-------
## Introduction ##
Two typical representations for pictures are raster images and vector graphics. Raster images are represented by an array of picture elements called pixels, each of which has a numerical value indicating its color. Pixel-based image representation has revealed its drawbacks in many applications, such as image magnification and printing. Vector graphics define images using graphics objects like curves and surfaces rather than discrete pixels. The vector representation has many attractive features, such as being scalable, compact, and resolution independent.
Conversion between pixel images and vector graphics is often desired. The conversion from vectors to images in WL is straightforward using the Rasterize function. The opposite direction, however, is surprisingly not easy at all and has attracted increasing interest in recent years. The following figure illustrates some possible issues in reconstructing line edges and surface colors from an image.
![Possible issues with ImageGraphics[] function][1]
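To make the asymmetry concrete, here is a minimal sketch (the sample graphic `g` is my own hypothetical choice): the forward direction is a single Rasterize call, while the reverse direction is what the rest of this post is about.

```mathematica
(* vector -> raster: one function call *)
g = Graphics[{Thick, Circle[], Line[{{-1, -1}, {1, 1}}]}];
img = Rasterize[g, ImageSize -> 200];
(* raster -> vector: no simple inverse; the pipeline below
   (thinning -> feature extraction -> graph -> simplified lines) is one approach *)
```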
-------
## Approach ##
The conversion process, in general, is divided into three stages. First, edge detection is performed to highlight the image outline; the result is a chain of pixels. Not all of those pixels are necessary to reconstruct the original image, so a feature-extraction stage is required to keep only the important pixels (edges, corners, etc.). Finally, the key points are connected to form graphics lines and meshes, as shown below, to reconstruct image outlines and surface colors. The following figure illustrates these stages:
![Conversion from Raster Image to Vector Graphics][2]
The scope of this post covers converting line drawing images to vector graphics (i.e. reconstructing shape outline only without surface color).
Image to graph
--
It is convenient to use a graph object as an intermediate representation of the image features, where key points represent graph nodes and image lines are graph links. Once the graph is constructed, it is easy to replace graph links with the actual curve shapes. A good starting point is the *MorphologicalGraph* function applied to a skeleton image (e.g. ![skeleton image][3], the result of the *Binarize* and *Thinning* functions applied to the input image). The graph links are then modified to follow the exact shape of the original image.
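As a rough sketch of that starting point (assuming `img` is a line-drawing image you supply):

```mathematica
skeleton = Thinning@Binarize@img;   (* one-pixel-wide skeleton *)
g = MorphologicalGraph[skeleton]    (* vertices at endpoints and branch points *)
```

Note that *MorphologicalGraph* connects its vertices with straight edges, which is exactly why the links have to be modified afterwards to follow the original curves.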
The process is divided into three steps:
1. Convert image into list of vertices and links
2. Find connection between vertices and links
3. Find edges
**Feature Extraction (Vertices & Links)**
In the first step, we use *SkeletonEndPoints* and *SkeletonBranchPoints* as feature points. They will serve as the graph's vertices, which define the shape of our vector graphics. I tested the function on a simple image with 3 endpoints and 1 branch point (I should have included a loop for completeness). Each vertex is given a unique index, as shown below (I marked vertices in yellow for better visualization). One tricky situation is a circle (or any isolated loop) that doesn't contain any branch: in this case, *MorphologicalTransform* returns an empty result, so we need to think of something else. I found *ImageCorners* pretty useful, but it returns (x,y) coordinates instead of (row,column) pairs, so you need to convert them first before producing an image that can be combined, via *ImageAdd*, with the vertices.
findVertices[skeleton_, opt : OptionsPattern[Graph]] := Module[
{vertices, others},
vertices = ImageAdd[
MorphologicalTransform[skeleton, "SkeletonEndPoints", Padding -> 0],
MorphologicalTransform[skeleton, "SkeletonBranchPoints", Padding -> 0]];
{vertices, others}
];
The result of this function is shown below.
![vertices][4]
Subtracting the vertices from the image produces line segments. These are the links that will connect the graph vertices, and they will be fed to the simplification algorithm in the next stage. Note that each link is given an index that is unique among both links and vertices. The number of segments, as well as the length of each segment, is also measured.
findLinks[skeleton_, vertices_, vertexCount_] := Module[
{dims = ImageDimensions[skeleton], linkComponents, others},
linkComponents = MorphologicalComponents@ImageSubtract[skeleton, vertices];
linkComponents = Replace[linkComponents, Except[0, n_] :> n + vertexCount, {2}];
{linkComponents, others}
]
![links][5]
**Features to Graph (Links and Nodes)**
The second step is by far the most involved step in the whole process: finding which vertex is connected to which link. A connection can be strong (from the four axis directions) or weak (also including diagonals). There are also some special cases, like finding short connections that consist of only 2 pixels, etc. A good starting point is *ComponentMeasurements*, measuring the *"Neighbors"* of vertices and links. To find strong connections you can set *CornerNeighbors* to *False*, while setting it to *True* yields weak connections. A sample output is shown below.
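A minimal sketch of that measurement (assuming `vertexComponents` and `linkComponents` are the label matrices from the previous steps; since their indices are disjoint and the components don't overlap, the two matrices can simply be added):

```mathematica
combined = vertexComponents + linkComponents;  (* one label matrix for all components *)
strong4 = ComponentMeasurements[combined, "Neighbors", CornerNeighbors -> False];
weak8 = ComponentMeasurements[combined, "Neighbors", CornerNeighbors -> True];
```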
![Strong Connections][6] ![Weak Connections][7]
Note that point 4 and link 6 are strongly connected. Point 2 is clearly a branch point, since it is connected to three links (5, 6, 7). Without going into the details of the internal subfunctions involved in this step, I'll show the general function to give you an idea of how complex this step is.
Options[morphologicalGraphics] = Options[Graph];
morphologicalGraphics[skeleton_, opts : OptionsPattern[morphologicalGraphics]] := Module[
{vertices, vertexComponents, linkComponents, strong4, weak8,
redundantEdges, directEdges, linkedEdges, loopEdges, cleanEdges, extraEdges, allEdges, allLines},
{vertices, others} = findVertices[skeleton];
{linkComponents, others} = findLinks[skeleton, vertices, others];
{strong4, others} = findStrongConnections[vertexComponents, linkComponents, others];
{weak8, others} = findWeakConnections[vertexComponents, linkComponents, others];
redundantEdges = findRedundantEdges[strong4, others];
{directEdges, linkedEdges, loopEdges, cleanEdges} = findEdges[weak8,redundantEdges, others];
{extraEdges, others} = findExtraEdges[strong4, linkComponents, weak8, others];
(* convert all edges into list of points *)
allLines = Join[
Replace[List @@@ directEdges, Dispatch@vertexCoordinates, {2}],
Replace[Join[List @@@ cleanEdges, List @@@ extraEdges], Dispatch@Join[vertexCoordinates, linkPositions], {2}]
]
];
The final output of the graph is converted to a list of coordinates defining the lines in the original image. Those points are not necessarily listed in the correct order. Furthermore, not all of the points are needed to reproduce the line segments. This leaves room for the line-simplification process (next stage), which results in smoother vector graphics as well as a smaller file size.
The following code tests the *morphologicalGraphics* function shown above.
Framed@Module[{x},
x = morphologicalGraphics[skeleton];
Image[skeleton, ImageSize -> Small] ->
Graphics[{Line[x], Red, PointSize[0.03], Point /@ x, Blue,
Point[Join[First /@ x, Last /@ x]]},
PlotLabel -> "Blue edges are correct, \nred points are not in the correct order", ImageSize -> 250]
]
![Result of morphologicalGraphics][8]
----------
## Line Simplification ##
The Douglas-Peucker algorithm is used to simplify the curves and lines connecting graph vertices (called polylines). Simplification in this context means reducing the number of points needed to draw a polyline while maintaining its basic shape. The algorithm assumes that the points connecting two vertices are in the right order (this is handled by the *pointsToPath* function). The process is divided into two steps:
1. Rearrange list of points
2. Simplify line segments
**Rearrange points as Path**
In the first step, we benefit from the intermediate representation we have. Since we converted the raster image into a graph of nodes and links, we can use graph theory to rearrange the list of points: create a *DelaunayMesh*, then trace the shortest path from the start to the end using *FindHamiltonianPath*.
There are two special cases: 1) when the starting point meets the end point, we have a loop, and FindHamiltonianPath fails to find a path; one possible solution is to use *FindHamiltonianCycle*. 2) If by any chance all the points are aligned in a straight line, *DelaunayMesh* produces an empty region (since the points lie in 1D); in this case we simply use *Sort* on the points. The following code illustrates this step.
pointsToPath[points2_List] := Module[
{a, b, points, region, edges, graph, path},
points = points2;
(* Loop detection :TODO: needs enhancement using FindHamiltonianCycle *)
If[points[[1]] == points[[-1]],
points = Drop[points, -1];
{a, b} = TakeDrop[points, {Position[points, Last@Nearest[points, points[[1]], 2]][[1, 1]]}];
points = Join[b, a];
];
(* Create a Delaunay mesh; note the lowercase "region" -- Region is a Protected built-in symbol *)
region = DelaunayMesh[points];
If[region === EmptyRegion[2], Return[Sort@points]];
(* Mesh to graph *)
edges = Map[Sort, MeshCells[region, 1][[All, 1]]];
graph = Graph[Range[Max@edges], Apply[UndirectedEdge, edges, {1}],
EdgeWeight -> Map[EuclideanDistance @@ points[[#]] &, edges]];
path = FindHamiltonianPath[graph, 1, Length[points]];
Part[points, path]
]
And here is the result.
![rearranged points][9]
**Simplification Step**
Once the points are in the correct order, we can apply the Douglas-Peucker algorithm. Referring to the previous figure: the algorithm will not change the blue points (vertices); it will simplify the red points (segment points), reconstructing the curves that connect blue points with the minimum number of red points. The process works as follows:
First, connect two endpoints (p1, p2) with a line segment and measure the distance between each intermediate point (qi) and this line.
pointLineDistance[q_, {p1_, p2_}] := With[
{eqn = (q - p1).(p2 - p1) / (p2 - p1).(p2 - p1)},
Which[
eqn <= 0 , Norm[q - p1],
eqn >= 1 , Norm[q - p2],
True , Norm[q - (p1 + eqn (p2 - p1))]
]
];
Then, if the distance to the furthest point is greater than the threshold (the smoothness factor), pick that point as a new endpoint and split the line into two segments. Otherwise, the segmentation process is done and all the other intermediate points can be eliminated.
lineSplit[segment[points_List], threshold_] := Module[
{dists, dmax = 0, pos},
dists = Map[pointLineDistance[#, {points[[1]], points[[-1]]}] &, points[[2 ;; -2]]];
dmax = Max[dists];
If[dmax > threshold,
pos = Position[dists, dmax][[1, 1]] + 1;
{segment[points[[1 ;; pos]]], segment[points[[pos ;;]]]},
segment[points, done]
]
] /; Length[points] > 2
lineSplit[segment[points_List], threshold_] := segment[points, done];
lineSplit[segment[points_List, done], threshold_] := segment[points, done];
Finally, recursively apply the segmentation process using *ReplaceRepeated* until all segments are marked *done*. Note that I used *First /@ First /@ ...* to access the data points in the *segment[points, done]* structures produced by the *lineSplit* function.
simplifyPolyline[points_List, threshold_] :=
Append[First /@ First /@ Flatten[
ReplaceRepeated[{segment[points]},
s_segment :> lineSplit[s, threshold]]
], Last[points]
];
A simple example illustrating the simplification process is shown below. For a better approximation of curvy shapes, the line segments can be used as supporting lines for *BSplineCurve* curves.
![line simplification][10]
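That last remark can be sketched as follows (assuming `orderedPts` is an ordered polyline you supply, e.g. the output of *pointsToPath*; the threshold 2. is arbitrary):

```mathematica
pts = simplifyPolyline[orderedPts, 2.];              (* reduced set of support points *)
Graphics[{Gray, Line[pts], Red, BSplineCurve[pts]}]  (* polyline vs. smoothed B-spline *)
```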
----------
## SketchGraphics Function ##
Finally, we are ready to wrap everything up in a nice and compact function. As described at the beginning, the function accepts a raster line-drawing image (skeleton) and produces vector graphics by converting the image into a graph and then simplifying the links between nodes.
sketchGraphics[skeleton_Image, polyline_, smoothness_,
opts : OptionsPattern[Graph]] := Module[
{allLines, allPaths},
allLines = morphologicalGraphics[skeleton];
allPaths = Map[ simplifyPolyline[#, smoothness] &,
pointsToPath[#] & /@ allLines];
Graphics[polyline /@ allPaths, Sequence @@ Flatten@{opts}]
];
And here we are. Left: input, Right: output.
![final result][11]
----------
## More Tests ##
![Test 1 PNG][12] == ![Test 1 SVG][13]
![Test 2 PNG][14] == ![Test 2 SVG][15]
![Test 3 PNG][16] == ![Test 3 SVG][17]
![Test 4 PNG][18] == ![Test 4 SVG][19]
![Test 5 PNG][20] == ![Test 5 SVG][21]
![Test 6 PNG][22] == ![Test 6 SVG][23]
----------
## Technical Notes ##
- **Using ImageCorners**:
I mentioned that this function can help solve the loop problem when no branch point or endpoint is found. However, some processing is needed before you can add the result to the vertices image. *ImageCorners* returns a list of graphics coordinates (x, y), so you'll need to convert them to matrix coordinates (row, column) and then to a binary image. The following code is one way to do that (there is probably a better way that avoids the *For* loop):
vertices = Transpose@ConstantArray[0, ImageDimensions[skeleton]];
height = Last@ImageDimensions[skeleton];
vertexCoordinates = ImageCorners[skeleton, 1, 0.001, 2];
{row, col} = IntegerPart@Apply[{height - (#2 - .5), #1 + .5} &, vertexCoordinates, {1}] // Transpose;
For[i = 1, i <= Length[vertexCoordinates], i++,
vertices[[ row[[i]], col[[i]] ]] = 1;
];
Image[vertices]
![coordinate systems][24]
- **More details on the *pointsToPath* function**
In the simplification step, we must make sure that the points in a line segment are in the correct order before we can use the Douglas-Peucker algorithm. This involves two operations: constructing a Delaunay mesh and then finding the Hamiltonian path. The following three figures show (from left to right): the list of points, the distorted graphics object, and the sorted points on the mesh.
![Point to Path][25]
- **From Image to Graphics:**
In case you'd like to see how I created the following illustration, here is the code from the Electromagnetics Package:
![Conversion from Raster Image to Vector Graphics][26]
<< NDSolve`FEM`;
img[size_] := ColorNegate@Binarize@Rasterize[Style["Hi", size]];
{{
ArrayPlot[
pimg = ReplacePart[
MorphologicalComponents[img[20]], {5, 19} -> 1], Mesh -> All,
ImageSize -> Small, PlotLabel -> "Pixel Image",
ColorRules -> {0 -> White, 1 -> Red, 2 -> Green, 3 -> Blue}],
cimg = ImageCorners[img[700], 0.5];
HighlightImage[ColorNegate@Thinning[EdgeDetect@img[700]],
{PointSize[0.03], Green,
Point /@ cimg[[{1, 2, 13, 14, 3, 4, 20, 19, 9, 8, 18, 17, 1}]],
Blue, cimg[[{5, 6, 16, 11, 22, 21, 10, 15, 7, 12, 5}]],
Red, cimg[[23 ;; -1 ;; 3]]},
ImageSize -> Small, PlotLabel -> "Points to Graphics Lines"]
}, {
toMesh[poly_, color_] :=
MeshRegion[
ToElementMesh[poly, MaxCellMeasure -> 570, AccuracyGoal -> .125],
MeshCellStyle -> {{1, All} -> color}, PlotTheme -> "Lines"];
Labeled[Show[{
toMesh[
Polygon@cimg[[{1, 2, 13, 14, 3, 4, 20, 19, 9, 8, 18, 17, 1}]],
Green],
toMesh[Polygon@cimg[[{5, 6, 16, 11, 22, 21, 10, 15, 7, 12, 5}]],
Blue],
toMesh[Disk[Mean /@ Transpose@cimg[[23 ;;]], 45], Red]}
], "Points to Graphics Mesh", Top, FrameMargins -> 5,
LabelStyle ->
Directive[GrayLevel[.5], FontSize -> 12,
FontFamily -> "Helvetica"]],
Labeled[
gimg = Graphics[{Green,
Polygon@cimg[[{1, 2, 13, 14, 3, 4, 20, 19, 9, 8, 18, 17, 1}]],
Blue, Polygon@cimg[[{5, 6, 16, 11, 22, 21, 10, 15, 7, 12, 5}]],
Red, Disk[Mean /@ Transpose@cimg[[23 ;;]], 45]}],
"Vector Graphics", Top, FrameMargins -> 7,
LabelStyle ->
Directive[GrayLevel[.5], FontSize -> 12,
FontFamily -> "Helvetica"]]
}} // Grid
----------
## Acknowledgment ##
At the end of this post, I'd like to thank Wolfram Research for this amazing opportunity, with special thanks to my mentors Markus van Almsick and Todd Rowland. And I guess it would be nice to finish this post with this piece of code. This time I converted the whole code into a Wolfram Package :)
SetDirectory[NotebookDirectory[]];
Needs["sketchGraphics`"];
skeleton = Thinning@Binarize@EdgeDetect@Rasterize@Style["Thanks Markus !",
FontSize -> 40, FontFamily -> "Bold Italic Art"];
sketchGraphics[skeleton, BSplineCurve, 0.2];
And here is the result !
![Thanks Markus][27]
**References:**
- Mark McClure, "Polyline Simplification", http://demonstrations.wolfram.com/PolylineSimplification/
[1]: http://community.wolfram.com//c/portal/getImageAttachment?filename=1.PNG&userId=884569
[2]: http://community.wolfram.com//c/portal/getImageAttachment?filename=2.PNG&userId=884569
[3]: http://community.wolfram.com//c/portal/getImageAttachment?filename=3.png&userId=884569
[4]: http://community.wolfram.com//c/portal/getImageAttachment?filename=4.PNG&userId=884569
[5]: http://community.wolfram.com//c/portal/getImageAttachment?filename=6.PNG&userId=884569
[6]: http://community.wolfram.com//c/portal/getImageAttachment?filename=7.PNG&userId=884569
[7]: http://community.wolfram.com//c/portal/getImageAttachment?filename=8.PNG&userId=884569
[8]: http://community.wolfram.com//c/portal/getImageAttachment?filename=9.PNG&userId=884569
[9]: http://community.wolfram.com//c/portal/getImageAttachment?filename=10.PNG&userId=884569
[10]: http://community.wolfram.com//c/portal/getImageAttachment?filename=11.PNG&userId=884569
[11]: http://community.wolfram.com//c/portal/getImageAttachment?filename=12.PNG&userId=884569
[12]: http://community.wolfram.com//c/portal/getImageAttachment?filename=13.png&userId=884569
[13]: http://community.wolfram.com//c/portal/getImageAttachment?filename=14.svg&userId=884569
[14]: http://community.wolfram.com//c/portal/getImageAttachment?filename=15.png&userId=884569
[15]: http://community.wolfram.com//c/portal/getImageAttachment?filename=15.svg&userId=884569
[16]: http://community.wolfram.com//c/portal/getImageAttachment?filename=16.png&userId=884569
[17]: http://community.wolfram.com//c/portal/getImageAttachment?filename=16.svg&userId=884569
[18]: http://community.wolfram.com//c/portal/getImageAttachment?filename=17.png&userId=884569
[19]: http://community.wolfram.com//c/portal/getImageAttachment?filename=17.svg&userId=884569
[20]: http://community.wolfram.com//c/portal/getImageAttachment?filename=18.png&userId=884569
[21]: http://community.wolfram.com//c/portal/getImageAttachment?filename=18.svg&userId=884569
[22]: http://community.wolfram.com//c/portal/getImageAttachment?filename=19.png&userId=884569
[23]: http://community.wolfram.com//c/portal/getImageAttachment?filename=19.svg&userId=884569
[24]: http://community.wolfram.com//c/portal/getImageAttachment?filename=98495.PNG&userId=884569
[25]: http://community.wolfram.com//c/portal/getImageAttachment?filename=20.PNG&userId=884569
[26]: http://community.wolfram.com//c/portal/getImageAttachment?filename=2.PNG&userId=884569
[27]: http://community.wolfram.com//c/portal/getImageAttachment?filename=20.svg&userId=884569

Sa'di Altamimi, 2017-08-09T15:13:40Z

Extending FindRoot
http://community.wolfram.com/groups/-/m/t/1132423
The function `FindRoot` finds only one root; there is no built-in way of finding multiple roots. User J. M. extended `FindRoot` with the function [FindAllCrossings][1], adapted slightly from Stan Wagon's book *Mathematica in Action*.
The implementation is pretty slick: you plot the function over a range and tell the `Plot` function to mesh the zeros, then extract those zeros and use each one as an initial condition for `FindRoot`.
But this implementation has a major drawback: it fails when the function touches but does not cross zero, as with `x^2`, for example.
To extend J. M.'s implementation I added some more lines of code, and I've made the code more verbose. The new lines calculate the zeros of the derivative of the function, test whether the function value there is close to zero, and then refine the candidates with `FindRoot`. This works even with a function like `Abs[x]`, which has a discontinuous derivative and never crosses zero; thanks to the interpolated result, that case is handled too.
Options@FindRoots = Sort@Join[Options@FindRoot, {MaxRecursion -> Automatic, PerformanceGoal :> $PerformanceGoal, PlotPoints -> Automatic, Debug -> False, ZeroTolerance -> 10^-2}];
FindRoots[fun_, {var_, min_, max_}, opts:OptionsPattern[]] := Module[{PlotRules, RootRules, g, g2, pts, pts2, lpts, F, sol},
(* Extract the Options *)
PlotRules = Sequence @@ FilterRules[Join[{opts}, Options@FindRoots], Options@Plot];
RootRules = Sequence @@ FilterRules[Join[{opts}, Options@FindRoots], Options@FindRoot];
(* Plot the function and "mesh" the point with y-coordinate 0 *)
g = Normal@Plot[fun, {var, min, max}, MeshFunctions -> (#2 &), Mesh -> {{0}}, Method -> Automatic, Evaluate@PlotRules];
(* Get the meshes zeros *)
pts = Cases[g, Point[p_] :> SetPrecision[p[[1]], OptionValue@WorkingPrecision], Infinity];
(* Get all plot points *)
lpts = Join@@Cases[g, Line[p_] :> SetPrecision[p, OptionValue@WorkingPrecision], Infinity];
(* Derive the interpolated data to find other zeros *)
F = Interpolation[lpts, InterpolationOrder->2];
g2 = Normal@Plot[Evaluate@D[F@var, var], {var, min, max}, MeshFunctions -> (#2 &), Mesh -> {{0}}, Method -> Automatic, Evaluate@PlotRules];
(* Get the meshes zeros and retain only small ones *)
pts2 = Cases[g2, Point[p_] :> SetPrecision[p[[1]], OptionValue@WorkingPrecision], Infinity];
pts2 = Select[pts2, Abs[F@#] < OptionValue@ZeroTolerance &];
pts = Join[pts, pts2]; (* Join all zeros *)
(* Refine zeros by passing each point through FindRoot *)
If[Length@pts > 0,
pts = Map[FindRoot[fun, {var, #}, Evaluate@RootRules]&, pts];
sol = Union@Select[pts, min <= Last@Last@# <= max &];
(* For debug purposes *)
If[OptionValue@Debug, Print@Show[g, Graphics@{PointSize@0.02, Red, Point[{var, fun} /. sol]}]];
sol
,
If[OptionValue@Debug, Print@g];
{}
]
]
Example:
![enter image description here][2]
My primary use of this function is to find the eigenvalues of dielectric waveguides, which appear as the zeros of the boundary conditions.
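For instance, a transcendental equation of the kind that appears in such problems can be handled directly (a hedged example; the actual boundary-condition function depends on the geometry):

```mathematica
FindRoots[Tan[x] - x, {x, 0.1, 20}]  (* all roots of a transcendental equation on the interval *)
```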
[1]: https://mathematica.stackexchange.com/questions/5663/about-multi-root-search-in-mathematica-for-transcendental-equations#answer-5666
[2]: http://community.wolfram.com//c/portal/getImageAttachment?filename=2017-07-01_153306.png&userId=845022

Thales Fernandes, 2017-07-01T18:39:46Z

Airline Performance in 2015
http://community.wolfram.com/groups/-/m/t/1161914
![enter image description here][1]
![enter image description here][2]
#Extracting from the Zip Archive#
The data comes in a zip archive that contains four files. One file contains data on 1 million flights that took place in 2015 within the United States. It's a sample from a [Kaggle dataset][3] with over 5 million flights; I randomly sampled that data to keep the memory footprint manageable. (For now, Mathematica does not cope well with data larger than computer memory.) Let's see what's in the ZIP archive.
I first import and clean the data.
zipArchive = "C:\\Users\\Seth\\Downloads\\flight-delays\\flights15.zip";
(* you will need to use your own file name here *)
We see there are four files. The file containing the actual data is flightsOneMillion.csv.
filesInArchive = Import[zipArchive]
> {"flightsOneMillion.csv", "L_Airport.csv", "L_Airport_ID.csv",
> "IATA_AirlineCodes.csv"}
Let's import it. This will probably take 1-5 minutes depending on the speed of your system. Be patient! You should end up with a data structure whose dimensions are 1,000,001 x 31 (the extra row is the header).
Dimensions[csv = Import[zipArchive, filesInArchive[[1]]]]
> {1000001, 31}
While we're at it, let's import the second and third files. We first import a file mapping conventional IATA airport codes to actual airport names. We then import a file mapping alternative airport codes to the same set of actual airport names.
Dimensions[iataCSV = Import[zipArchive, filesInArchive[[2]]]]
> {6430, 2}
Dimensions[alternativeCSV = Import[zipArchive, filesInArchive[[3]]]]
> {6415, 2}
Finally, we want to import a file mapping airline code abbreviations (like UA) to the actual name of the airline (United Airlines). The fourth file contains this mapping.
Dimensions[airlineCSV = Import[zipArchive, filesInArchive[[4]]]]
> {15, 2}
##Import the Data and Create a Dataset##
Now let's use one of the programming idioms we have developed to convert the flights CSV data into a Wolfram Language Dataset. Once this is done, we will free up some memory by using Remove on the original csv data.
flights = Dataset[Map[AssociationThread[First@csv, #] &, Rest@csv]];
Remove[csv]
Let's look at the first 10 rows.
flights[1 ;; 10]
![enter image description here][4]
##Clean the Data##
###Changing the column headers to lower case###
Notice that all the column headers are in upper case. This is a nuisance, so let's change them all to lower case. To do this, remember that each row of a Dataset with column headers is an Association, so we use the KeyMap function, which operates on the keys of an Association instead of the values. Again, here and elsewhere, have some patience: this is pretty big data. I print out the first four rows to make sure I have succeeded.
flights = flights[All, KeyMap[ToLowerCase]];
flights[1 ;; 4]
![enter image description here][5]
###Add the airline name###
Working with airline codes is inconvenient. So, I want to add the actual name of the airline to each row of the Dataset. To do this, I convert the mapping between airline code and airline name into an Association.
airlineCodeAssociation = Association[Apply[Rule, Rest@airlineCSV, {1}]]
![enter image description here][6]
And now let's use that Association to add the airline name right after the airline code. We'll add an "airlineName" column right after the "airline" column and before the "flight_number" column. To do this, we use Insert, which lets you insert a key-value pair at a designated position in an association.
newpos = 1 + Position[Keys[flights[[1]]], "airline"][[1, 1]]
> 6
flights = Query[All, Insert[#, "airlineName" -> airlineCodeAssociation[#airline], newpos] &][flights]
![enter image description here][7]
##Working with Dates and Times##
The csv file we extracted from the zip archive did not store dates and times in a particularly sensible way. We want to convert at least some of these dates and times so that they have desirable properties. I first create a function, getTimeObjectFromField. It takes an association that has some field in which time is stored as an integer like 1455, and converts it into a TimeObject representing 2:55 pm.
getTimeObjectFromField[a_Association, field_] := TimeObject[{Quotient[a[field], 100], Mod[a[field], 100], 0}]
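For example, with a hypothetical association storing 1455 under the field "t":

```mathematica
getTimeObjectFromField[<|"t" -> 1455|>, "t"]
(* TimeObject[{14, 55, 0}], i.e. 2:55 pm *)
```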
Now I create three little functions that create new columns, "date", "scheduled_departure_time", and "scheduled_arrival_time", that behave as proper dates and times in the Wolfram Language. Again I use Insert to place the new key-value pairs in the appropriate spots. I would have preferred to do this with a single Insert rather than chaining multiple Insert operations, but this is currently not permitted in the Wolfram Language.
converter1 = (Insert[#, "date" -> DateObject[{#year, #month, #day}],
Key["day_of_week"]]) &;
converter2 = (Insert[#,
"scheduled_departure_time" ->
getTimeObjectFromField[#, "scheduled_departure"],
Key["departure_time"]] &);
converter3 = (Insert[#,
"scheduled_arrival_time" ->
getTimeObjectFromField[#, "arrival_time"],
Key["scheduled_arrival"]] &);
I'll now right-compose those conversions and modify the **flights** Dataset.
flights = Query[All, converter1 /* converter2 /* converter3][flights]
![enter image description here][8]
##The horrible October problem##
Let's look at the distribution of flights by origin airport by month. We'll sort the months, and within each month we'll sort the origin airports by the number of flights that originated there, from most flights to least.
flightsByMonthAndOrigin =
Query[GroupBy[#month &] /* KeySort,
GroupBy[#"origin_airport" &] /* ReverseSort, Length][flights]
![enter image description here][9]
What you can see is that ATL and ORD come out on top except during month 10 (October), when the mysterious airports 10397 and 13930 have the most flights. Those airport codes do not have the typical 3 letters of IATA codes. In fact, what appears to have happened is that a different coding system was used during October.
If we are to do proper analysis, we need our airport codes to be consistent. Two files that we've already imported are going to help us straighten things out. We have iataCSV, which shows the relationship between IATA code and actual airport name, and we have alternativeCSV, which shows the relationship between alternative airport code and actual airport name. Since the two files share a common piece of data -- the actual airport name -- we can use a JoinAcross operation to create a mapping between alternative airport code and IATA code. Once we have this mapping we can clean up the data.
Let's start by creating two lists of Associations.
iataAssociation = Map[AssociationThread[{"IATA_Code", "airportName"}, #] &, Rest@iataCSV]
![enter image description here][10]
alternativeAssociation = Map[AssociationThread[{"alternative_Code", "airportName"}, #] &, Rest@alternativeCSV]
![enter image description here][11]
We can now use a JoinAcross operation to link up the two Associations.
airportCodesAssociation = JoinAcross[iataAssociation, alternativeAssociation, "airportName"]
![enter image description here][12]
We now just create a simple association that goes from alternative code to IATA code.
airportCodeTranslator = Query[Association, Slot["alternative_Code"] -> Slot["IATA_Code"] &][airportCodesAssociation]
![enter image description here][13]
We now write a function that takes an argument s. If s is a key in airportCodeTranslator, the function outputs the value for that key (an IATA code). If not, the function outputs s (the original code).
airportTranslate[s_] := Lookup[airportCodeTranslator, s, s]
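As a quick sanity check, here is a hypothetical pair of calls, assuming (as the monthly counts above suggest) that airportCodeTranslator contains the entry "10397" -> "ATL":

    airportTranslate["10397"]  (* key found: translated to "ATL" *)
    airportTranslate["BOS"]    (* not a key: returned unchanged *)

Because Lookup's third argument is the default, codes that are already IATA codes pass through untouched.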
We can now fix the October problem.
flights =
Query[All,
Association[#,
"origin_airport" -> airportTranslate[#"origin_airport"],
"destination_airport" ->
airportTranslate[#"destination_airport"]] &][flights]
![enter image description here][14]
Let's list the distinct origin and destination airports and see if the problem is gone. All the origin and destination airports now have three-letter codes. The October problem appears to be fixed!
Normal@flights[DeleteDuplicates, #"origin_airport" &]
![enter image description here][15]
Normal@flights[DeleteDuplicates, #"destination_airport" &]
![enter image description here][16]
##Getting rid of flights for which we have no arrival delay information##
We have one last piece of cleanup to do. Most of our analysis from here on is going to deal with arrival delays. Rather than rewrite every piece of code to Select only those flights for which we have such information, let's create a new Dataset that filters out flights missing this critical information. Moreover, we are only going to deal with some of the columns; I really don't care, for example, about the tail number of the flight, so we can reduce the memory footprint of the Dataset. We will call the new dataset by the short name **ds**.
wantedColumns = {"date", "month", "day_of_week", "airline",
"airlineName", "origin_airport", "destination_airport",
"scheduled_departure_time", "scheduled_arrival_time",
"arrival_delay"};
ds = Query[Select[NumericQ[#"arrival_delay"] &], wantedColumns][flights]
![enter image description here][17]
#Analysis#
##Late Arrivals and Creating the lateness Function##
Let's write a fairly complex function that takes a list of potential delay thresholds and determines the fraction of flights whose delays exceed each of those values. We'll know, for example, what fraction of flights have arrival delays of more than 5 minutes, 15 minutes, and 60 minutes. We're going to do this using an **EmpiricalDistribution** and a **SurvivalFunction**. We have not seen these yet, but just accept that they work.
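For readers who want a peek anyway, here is a minimal sketch of what those two functions do, using a toy list of delays (the numbers are made up and are not from our Dataset):

    ed = EmpiricalDistribution[{-5, 0, 3, 10}];  (* four toy arrival delays *)
    sf = SurvivalFunction[ed];                   (* sf[x] = fraction of data strictly greater than x *)
    sf[0]                                        (* 1/2, since only 3 and 10 exceed 0 *)

So mapping sf over a list of thresholds gives exactly the "fraction late by more than t minutes" values we want.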
Let's create a list of delay thresholds about which we might care.
delays = {0, 5, 10, 15, 30, 60, 120}
Here's our function, called **lateness**. There are two optional arguments. The **keyFunction** argument specifies the form of the keys for the resulting association. The **postProcess** argument permits one to do things such as round the results. (Probably these should be written as formal options rather than simple optional arguments.)
lateness[thresholds_, keyFunction_: ("late" <> ToString[#] &),
  postProcess_: (Round[N[#], 0.001] &)][arrivalDelays_] :=
 Module[{ed = EmpiricalDistribution[arrivalDelays], sf},
  sf = SurvivalFunction[ed];
  AssociationThread[Map[keyFunction, thresholds], postProcess[Map[sf, thresholds]]]
 ]
Here's an example showing how the function works. I set my thresholds to 0 and 5 and then pass in arrival data, a list of 10 numbers. Notice that 60% of the data is above 0 and 30% of the data is above 5.
lateness[{0, 5}][{-3, -4, 0, 0, 1, 2, 4, 9, 11, 13}]
> <|"late0" -> 0.6, "late5" -> 0.3|>
We'll now use this function in our query. At the "whole shebang" first level I apply the **lateness** function, and at the second level (a bunch of Associations) I extract the "arrival_delay" column.
arrivalLateness = Query[lateness[delays], #"arrival_delay" &][ds]
![enter image description here][18]
So, it turns out that in 2015 about 18% of flights were late by more than 15 minutes. 5.6% of flights were more than an hour late. But 63.4% of flights were on time or early.
##Late Arrivals By Month##
We can now determine these values for each month. Here I group the data by month and then treat each resulting piece the same way I treated the whole Dataset in the code immediately above. We wrap the results of our Query with a KeySort so that the months are in order. What we can see is that February and June are actually the worst months for delays and that October is probably the best month in which to fly (on average).
Query[GroupBy[#month &] /* KeySort, lateness[delays], Slot["arrival_delay"] &][ds]
![enter image description here][19]
I can also determine how lateness varies according to the day of the week. The results show that Saturday is the best day to fly with Thursday generally being the worst.
Query[GroupBy[#"day_of_week" &] /* KeySort, lateness[delays],
Slot["arrival_delay"] &][ds]
![enter image description here][20]
##Late arrivals by airport##
Our Dataset uses airport codes instead of airport names. Often the latter is more convenient. I thus build a little Association that we can use to translate from 3-letter airport codes to airport names.
iatamap = Association[Rule @@@ Rest@iataCSV]
![enter image description here][21]
Here's an example showing how our iatamap Association works.
iatamap["IAH"]
> "Houston, TX: George Bush Intercontinental/Houston"
This same programming pattern can be used to compute the breakdown of lateness by originating airport. We'll only use airports that have more than 1000 originating flights. To do this, I first group by "origin_airport" and then select only those groups that have more than 1000 rows associated with them. The code is then essentially the same as when I grouped by month.
byOriginAirport =
Query[GroupBy[#"origin_airport" &] /* Select[Length[#] > 1000 &] /*
KeySort, lateness[delays, ("origin_late" <> ToString[#] &)],
Slot["arrival_delay"] &][ds]
![enter image description here][22]
We can now produce a Dataset in which the origin airports are sorted according to the fraction of flights that are at least 15 minutes late and in which each origin airport is given its full name. I show the top 10 entries. We can see that the Hawaiian airports do well, as do Akron, Ohio; Portland, Oregon; and Salt Lake City, Utah. Anchorage, Alaska does surprisingly well given the sometimes challenging weather that prevails there.
Query[1 ;; 10][
Query[SortBy[Slot["origin_late15"] &] /* KeyMap[iatamap]][
byOriginAirport]]
![enter image description here][23]
And here are the ten worst origin airports, including O'Hare; White Plains, New York; Baton Rouge; and Dallas Love Field.
Query[1 ;; 10][
Query[SortBy[Slot["origin_late15"] &] /* Reverse /* KeyMap[iatamap]][
byOriginAirport]]
![enter image description here][24]
We can also look at matters by destination airport. Again, the Hawaiian airports (LIH, KOA, ITO, OGG) do well, along with Salt Lake City.
byDestinationAirport =
Query[GroupBy[#"destination_airport" &] /*
Select[Length[#] > 1000 &] /* KeySort,
lateness[delays, ("destination_late" <> ToString[#] &)],
Slot["arrival_delay"] &][ds]
![enter image description here][25]
Here are the 10 best airports when serving as a destination. Again, many of the Hawaiian island airports do well. But so do Salt Lake City and Atlanta.
Query[1 ;; 10][
Query[SortBy[Slot["destination_late15"] &] /* KeyMap[iatamap]][
byDestinationAirport]]
![enter image description here][26]
The worst airports on the receiving end are Colorado Springs (snow?), LaGuardia, Little Rock, Fayetteville, Knoxville, and White Plains. It is unclear why the two Arkansas airports have problems, as the area is known neither for particularly bad weather nor for congestion.
Query[1 ;; 10][
Query[SortBy[Slot["destination_late15"] &] /* Reverse /*
KeyMap[iatamap]][byDestinationAirport]]
![enter image description here][27]
We can also find the worst routes by origin, destination, and airline. I use some fancy formatting code so that the result is not too wide. It turns out Delta's flight from San Francisco to Los Angeles and the return flight from Los Angeles to San Francisco are the worst, running more than 15 minutes late over 37% of the time. Several American Eagle flights in and out of LaGuardia also have serious problems, as do Southwest Airlines flights between Los Angeles and Phoenix.
byOriginDestinationAirline =
Query[GroupBy[{#"origin_airport", #"destination_airport", \
#airlineName} &] /* Select[Length[#] > 200 &] /* KeySort,
lateness[delays], Slot["arrival_delay"] &][ds];
Query[(1 ;; 20) /*
KeyMap[Style[{iatamap[#[[1]]], iatamap[#[[2]]], #[[3]]},
9] &], #"late15" &][
Query[SortBy[#late15 &] /* Reverse][byOriginDestinationAirline]]
![enter image description here][28]
##Comparison of lateness with airport as origin airport and airport as destination airport##
We can also compare lateness as an originating airport with lateness as a receiving airport. The code here is a bit challenging, but observe: first we make the airport a "city" column of the dataset.
asOriginAndDestination =
Dataset[JoinAcross[
KeyValueMap[Association[#2, "city" -> #1] &,
Normal[byOriginAirport]],
KeyValueMap[Association[#2, "city" -> #1] &,
Normal[byDestinationAirport]], "city"]][All,
KeyTake[#, {"city", "origin_late15", "destination_late15"}] &]
We can do something pretty fancy with this data. We can plot, for each airport, the fraction of flights that are more than 15 minutes late when it serves as an originating airport (x) versus the fraction that are more than 15 minutes late when it serves as a destination airport (y). The points are labeled with the associated airport, and one can superimpose on this graphic lines showing the mean fraction of flights late by more than 15 minutes.

Airports in the northeast quadrant of the plot, such as LGA (LaGuardia) and DEN (Denver), are worse than average. Airports above the blue line, such as COS (Colorado Springs) or SFO (San Francisco), are worse as a destination airport than as an origin airport; airports below the blue line, such as PHL (Philadelphia) and BTR (Baton Rouge), are worse as an origin airport than as a destination airport. Airports in the northwest quadrant, such as BDL (Bradley in Hartford, Connecticut) and ALB (Albany, New York), are worse than average as a destination airport but better than average as an origin airport. Airports in the southwest quadrant, such as ANC (Anchorage, Alaska) and ATL (Atlanta), are better than average both as an origin and as a destination airport. And the few airports in the southeast quadrant, such as CLT (Charlotte, North Carolina) and MDW (Midway, Chicago), are worse than average as an origin airport but better than average as a destination airport.

One can also see from this graphic that the Hawaiian island airports dominate in avoiding arrival delays, whether serving as the origin or the destination airport.
Module[{data = Normal[asOriginAndDestination[All, Values]], lp, g},
lp = ListPlot[Part[data, All, {2, 3}], Axes -> False, Frame -> True,
FrameLabel -> {"as origin", "as destination"},
PlotRange -> {0.05, 0.28},
PlotLabel -> "Fraction of flights more than 15 minutes late",
PlotRangePadding -> 0.01];
g = Graphics[
Map[Text[Style[#[[1]], 7], {#[[2]] + 0.003, #[[3]] + 0.002}] &,
data]];
Show[lp, g, Plot[{x, 0.17898}, {x, 0, 1}],
Graphics[{Orange, InfiniteLine[{{0.17898, -10}, {0.17898, 10}}]}]]
]
![enter image description here][29]
##Lateness by Time of Scheduled Departure##
Let's find out how lateness in arrival depends on the scheduled time of departure. We can use code similar to what we used to see how lateness varies by month. We extract the departure hour using the DateValue function with "Hour" as its second argument.
arrivalDelayByScheduledDepartureTime =
Query[GroupBy[DateValue[#"scheduled_departure_time", "Hour"] &] /*
KeySort, lateness[delays], Slot["arrival_delay"] &][ds]
![enter image description here][30]
Let's plot the result.
figureScheduledDeparture =
With[{plottingFunction = (ListLinePlot[#, Axes -> False,
Frame -> True,
FrameLabel -> {"scheduled departure time",
"fraction of flights at least this late"},
PlotLegends -> delays,
PlotLabel ->
"Fractions of Late Flights As A Function of Scheduled \
Departure Time\nData from 2015", ImageSize -> 600, BaseStyle -> 12,
PlotTheme -> "Web"] &)},
Transpose[arrivalDelayByScheduledDepartureTime][
Values /* plottingFunction, Values]
]
![enter image description here][31]
So, the scheduled departure time matters hugely in determining whether one is going to be late or not. Flights leaving at 5 or 6 in the morning are far more likely to be on time than flights leaving at around 8 p.m., by which time prior delays have cascaded.
##Analysis By Airline##
Now, let's see how lateness varies by airline. Again, most of our coding work is done; we just have to change how we group the data.
arrivalDelayByAirline =
KeySort[Query[GroupBy[#"airlineName" &], lateness[delays],
Slot["arrival_delay"] &][ds]]
![enter image description here][32]
I now sort the data according to whether the flights were more than 15 minutes late. We can see there is a big difference. Hawaiian Airlines does best, perhaps owing to our 50th state's generally fabulous weather, but Alaska Airlines does extremely well too, even though many of its flights go into more challenging destinations. Spirit Airlines (a budget carrier) and Frontier Airlines (also sometimes regarded as a budget carrier) do worst. Of the major carriers, Delta does best by a significant margin and United Airlines does worst. The ordering does not vary greatly based on which particular lateness metric one uses.
arrivalDelayByAirlineSorted =
Query[SortBy[#late15 &]][arrivalDelayByAirline]
![enter image description here][33]
Let's also try something of possible interest. First, let's find the 10 busiest origin airports.
busyAirports =
Query[GroupBy[#"origin_airport" &] /* Sort /*
Reverse /* (Take[#, UpTo[10]] &), Length][ds]
![enter image description here][34]
Now let's see how the airlines rank when their flights originate from these airports. This will be our most complicated analysis. First we get the busiest airports, filter our dataset down to those airports, and then group the data by airport. We then group each airport's flights by airline and for each calculate the fraction more than 15 minutes late. We sort each airport's airlines by the lateness fraction from worst to best.
busyAirportKeys = Normal@Keys@busyAirports
![enter image description here][35]
complexQuery =
Query[(Select[
MemberQ[busyAirportKeys, #"origin_airport"] &]) /* \
(GroupBy[{#"origin_airport"} &]),
GroupBy[#airlineName &] /* ReverseSort, lateness[{15}],
Slot["arrival_delay"] &]
![enter image description here][36]
byOriginAirportAndAirlineLateness = complexQuery[ds]
![enter image description here][37]
We can now print out the results for each of the airports.
Column@Table[
Framed[Labeled[byOriginAirportAndAirlineLateness[i],
First@Normal@Keys[byOriginAirportAndAirlineLateness][[i]],
Top]], {i, Length[byOriginAirportAndAirlineLateness]}]
![enter image description here][38]
The results show immense variation in lateness even when the departure airport is the same. Hubness matters. From Houston's Bush Intercontinental Airport (IAH), for example, Spirit Airlines is late 33.4% of the time whereas the large American Airlines is late only 14.8% of the time. In Atlanta, United is late 21.3% of the time whereas Delta is late only 12.7% of the time. In Denver, Alaska Airlines is late only 11.1% of the time whereas JetBlue is late 28.5% of the time.
#Conclusions#
Mathematica can handle large datasets reasonably well.
You can learn a lot about airline performance based on the Kaggle 2015 dataset.
[1]: http://community.wolfram.com//c/portal/getImageAttachment?filename=ScreenShot2017-08-10at2.33.12PM.png&userId=11733
[2]: http://community.wolfram.com//c/portal/getImageAttachment?filename=ScreenShot2017-08-10at2.30.10PM.png&userId=11733
[3]: https://www.kaggle.com/usdot/flight-delays
[4]: http://community.wolfram.com//c/portal/getImageAttachment?filename=sadfgr5yw6utejhfgnbsdvs.png&userId=11733
[5]: http://community.wolfram.com//c/portal/getImageAttachment?filename=rgfhe657iuykjhdfgnbs.png&userId=11733
[6]: http://community.wolfram.com//c/portal/getImageAttachment?filename=64033.png&userId=20103
[7]: http://community.wolfram.com//c/portal/getImageAttachment?filename=asdf3456yetjhdfgsdaa.png&userId=11733
[8]: http://community.wolfram.com//c/portal/getImageAttachment?filename=dfgfsrstyw45ytrhgsfds.png&userId=11733
[9]: http://community.wolfram.com//c/portal/getImageAttachment?filename=58216.png&userId=20103
[10]: http://community.wolfram.com//c/portal/getImageAttachment?filename=39687.png&userId=20103
[11]: http://community.wolfram.com//c/portal/getImageAttachment?filename=25768.png&userId=20103
[12]: http://community.wolfram.com//c/portal/getImageAttachment?filename=102349.png&userId=20103
[13]: http://community.wolfram.com//c/portal/getImageAttachment?filename=966110.png&userId=20103
[14]: http://community.wolfram.com//c/portal/getImageAttachment?filename=dsfg5467rutyjdhgsf.png&userId=11733
[15]: http://community.wolfram.com//c/portal/getImageAttachment?filename=650112.png&userId=20103
[16]: http://community.wolfram.com//c/portal/getImageAttachment?filename=334713.png&userId=20103
[17]: http://community.wolfram.com//c/portal/getImageAttachment?filename=305114.png&userId=20103
[18]: http://community.wolfram.com//c/portal/getImageAttachment?filename=792615.png&userId=20103
[19]: http://community.wolfram.com//c/portal/getImageAttachment?filename=1048016.png&userId=20103
[20]: http://community.wolfram.com//c/portal/getImageAttachment?filename=647217.png&userId=20103
[21]: http://community.wolfram.com//c/portal/getImageAttachment?filename=129718.png&userId=20103
[22]: http://community.wolfram.com//c/portal/getImageAttachment?filename=19.png&userId=20103
[23]: http://community.wolfram.com//c/portal/getImageAttachment?filename=20.png&userId=20103
[24]: http://community.wolfram.com//c/portal/getImageAttachment?filename=21.png&userId=20103
[25]: http://community.wolfram.com//c/portal/getImageAttachment?filename=22.png&userId=20103
[26]: http://community.wolfram.com//c/portal/getImageAttachment?filename=23.png&userId=20103
[27]: http://community.wolfram.com//c/portal/getImageAttachment?filename=24.png&userId=20103
[28]: http://community.wolfram.com//c/portal/getImageAttachment?filename=25.png&userId=20103
[29]: http://community.wolfram.com//c/portal/getImageAttachment?filename=ScreenShot2017-08-10at2.30.10PM.png&userId=11733
[30]: http://community.wolfram.com//c/portal/getImageAttachment?filename=27.png&userId=20103
[31]: http://community.wolfram.com//c/portal/getImageAttachment?filename=ScreenShot2017-08-10at2.33.12PM.png&userId=11733
[32]: http://community.wolfram.com//c/portal/getImageAttachment?filename=29.png&userId=20103
[33]: http://community.wolfram.com//c/portal/getImageAttachment?filename=30.png&userId=20103
[34]: http://community.wolfram.com//c/portal/getImageAttachment?filename=31.png&userId=20103
[35]: http://community.wolfram.com//c/portal/getImageAttachment?filename=32.png&userId=20103
[36]: http://community.wolfram.com//c/portal/getImageAttachment?filename=33.png&userId=20103
[37]: http://community.wolfram.com//c/portal/getImageAttachment?filename=34.png&userId=20103
[38]: http://community.wolfram.com//c/portal/getImageAttachment?filename=35.png&userId=20103

Seth Chandler, 2017-08-10

[WSC17] Visualizing shapes inside Moire patterns using Mathematica
http://community.wolfram.com/groups/-/m/t/1161905
Greetings,
The focus of this post is how we can use the Wolfram Language and Mathematica to better visualize the shapes present in a given Moire pattern. First, we will briefly review what Moire patterns are and how to build one in Mathematica. Then, we will look at the functions that let us see the shapes inside Moire patterns better. Lastly, we will use these functions to create interesting Manipulate and Cloud objects.
Introduction to Moire patterns
-------------------------------------------
The basic definition of a Moire pattern is: a pattern that can be observed when a ruled pattern is overlaid with a rotated or displaced version of the same (or a similar) pattern. Moire patterns are not only mesmerizing to watch but have real-life uses as well. In physics, for example, a pattern is projected on a surface; the surface is then modified, so the projected pattern changes. The new and the original patterns are then overlaid, and scientists can examine the resulting Moire pattern to determine the change in the surface. Here is an example of a Moire pattern:
![Moire Pattern built using Mathematica][1]
This example, built with Mathematica, shows a set of equidistant points overlaid by the same set rotated by 70 degrees.
The code to do so was:
pts = Table[Point[{x, y}], {x, -7, 7}, {y, -7, 7}];
Graphics[{pts, Rotate[pts, 70 Degree]}]
*First, we generate the set of points (pts), and then we rotate it using the function Rotate.*
In Mathematica you can generate Moire patterns not only from sets of points but from other shapes as well.
Let's take lines for example:
lines = Table[Line[{{x, 0}, {x, 1}}], {x, 0, 1, 0.05}];
Graphics@Table[Rotate[lines, r], {r, \[Pi]/5, \[Pi], \[Pi]/5}]
The resulting Moire Pattern would be:
![Linear Moire created using Mathematica][2]
In this particular example, we generated a table of sets of lines instead of just two. There is no special reason for it; it just looks better. You can do it with just two sets as well:
lines = Table[Line[{{x, 0}, {x, 1}}], {x, 0, 1, 0.05}];
Graphics[{lines, Rotate[lines, 20 Degree]}]
![Linear Moire created using Mathematica][3]
You can also play around with Manipulate and Moire patterns. A cool example would be:
points = Table[Point[{x, y}], {x, -10, 10}, {y, -10, 10}];
Manipulate[Graphics[Table[Rotate[Style[points, Hue[a]], a], {a, a, 4 a, \[Pi]/16}]], {a, 0, \[Pi]/4}]
*Beware: This Manipulate may require quite a bit of computation.*
The result should look something like this:
![Moire "Flower" generated in Mathematica][4]
Making shapes in Moire patterns more visible
--------------------------------------------
Mathematica has an array of functions for analyzing images and graphics. I explored a few of them and found that the best functions for visualizing the shapes inside Moire patterns were SkeletonTransform and VoronoiMesh. I also looked at Histogram3D to see how the density of the points in a Moire pattern changes with the angle.
Let's start with the function SkeletonTransform. Before using it, we must convert the graphic into an image, because SkeletonTransform only takes images as arguments. To do so, we just use Rasterize:
graph=With[{points = Table[Point[{x, y}], {x, -15, 15}, {y, -15, 15}]}, Graphics[{points, Rotate[points, 70]}]];
img=Rasterize[graph]
*Note: Instead of defining pts separately to build the graphic, we can localize the points using the function With, as in the code above.*
Now that we have an image, we can use SkeletonTransform to get the shapes. We will also crop the image so it's easier to see the shapes.
SkeletonTransform[ImageCrop[img, Round[ImageDimensions[img]/2]]] // ImageAdjust
The output should look like this:
![Skeleton of a Moire pattern][5]
SkeletonTransform does a good enough job representing the shapes, but there is another function that represents them better and more clearly. It is called VoronoiMesh, and it generates a mesh consisting of cells around the points it is given. To use this function, we need another method of rotating the points: instead of using Rotate as we did before, we use RotationTransform to get the coordinates of the rotated points. We crop the image as we did with SkeletonTransform, just so the shapes are clearer.
points = Table[Point[{x, y}], {x, -10, 10}, {y, -10, 10}];
r = RotationTransform[\[Pi]/4];
rotatedpoints = points /. Point -> r;
finalpoints = rotatedpoints /. {x_, y_} -> Point[{x, y}];
setofpoints = Join[points, finalpoints];
densitypoints = setofpoints /. Point[{x_, y_}] -> {x, y};
flat = Flatten[densitypoints, 1];
i = VoronoiMesh[flat, PlotTheme -> "Lines"];
image = Rasterize[i];
ImageCrop[image, Round[ImageDimensions[image]/2]] // ImageAdjust
The output should look like this:
![Voronoi Mesh][6]
*Note: Try different angles to get different shapes. (The angle in this code is determined by the argument of RotationTransform)*
I talked about how we could use Histogram3D to see how the density of the points changes with the angle. There are two ways to approach this: make a function and manually change the angle to get a Histogram3D for each given angle, or use a Manipulate with a slider to see how it changes. I took the second approach and added a graph of the Moire pattern itself to better visualize the change.
Here is the code for the Manipulate:
Manipulate[
Module[
{MoireHistogram, graph},
MoireHistogram[a_] :=
Module[{points, r, rotatedpoints, finalpoints, setofpoints,
densitypoints, flat2},
points = Table[Point[{x, y}], {x, -d, d}, {y, -d, d}];
r := RotationTransform[a];
rotatedpoints := points /. Point -> r;
finalpoints := rotatedpoints /. {x_, y_} -> Point[{x, y}];
setofpoints := Join[points, finalpoints];
densitypoints := setofpoints /. Point[{x_, y_}] -> {x, y};
flat2 := Flatten[densitypoints, 1];
If[gd,
Histogram3D[flat2, Automatic,
ChartElementFunction -> "GradientScaleCube"],
Histogram3D[flat2]]
];
graph[a_] :=
With[{points = Table[Point[{x, y}], {x, -d, d}, {y, -d, d}]},
Graphics[{points, Rotate[points, a]}]];
{Magnify[graph[a], 1.5], Magnify[MoireHistogram[a], 2]}],
{{a, \[Pi]/4, "Angle"}, 0, \[Pi]/2},
{{d, 4, "Density"}, 2, 15, 1},
{{gd, False, "Gradient Scale Cubes"}, {True, False}}
]
And here is a snapshot of the end product:
![Snapshot of a Moire pattern and its 3D Histogram][7]
Interesting Manipulate and Cloud objects
----------------------------------------
We have already seen some Manipulate objects, but there is more. For example, with Mathematica you can create a Manipulate that lets you change the angle, point density, and color of a VoronoiMesh from a Moire pattern, all in the same Manipulate:
Manipulate[
{Module[
{points, r, rotatedpoints, finalpoints, setofpoints, densitypoints,
flat},
points = Table[Point[{x, y}], {x, -d, d}, {y, -d, d}];
r = RotationTransform[a];
rotatedpoints = points /. Point -> r;
finalpoints = rotatedpoints /. {x_, y_} -> Point[{x, y}];
setofpoints = Join[points, finalpoints];
densitypoints = setofpoints /. Point[{x_, y_}] -> {x, y};
flat = Flatten[densitypoints, 1];
If[showpoints,
i = Graphics[{co,
MeshPrimitives[VoronoiMesh[flat, PlotTheme -> "Lines"], 1],
co2, Point[flat]}],
i = Graphics[{co,
MeshPrimitives[VoronoiMesh[flat, PlotTheme -> "Lines"], 1]}]];
image = ImageResize[Rasterize[i, ImageResolution -> 100], "Large"];
Magnify[ImageCrop[image, Round[ImageDimensions[image]/2]], 3]
]},
{{a, \[Pi]/4, "Angle"}, 0, 1.569},
{{d, 4, "Density"}, 2, 17, 1},
{{showpoints, False, "Show Voronoi Points"}, {True, False}},
{{co, Black, "Line Color"}, Black},
{{co2, Orange, "Point Color"}, Orange}
]
And the corresponding snapshot:
![Voronoi Manipulate object from Mathematica][8]
Moving on to the Cloud objects, we can create a Microsite that receives an angle and a point density and returns a Moire pattern and its Voronoi mesh. To do so, we first define a function, then a FormFunction, and lastly CloudDeploy that FormFunction.
func1[a_, d_] :=
Module[{MoireVoD, graphD},
MoireVoD[a, d] :=
Module[{points, r, rotatedpoints, finalpoints, setofpoints,
densitypoints, flat},
points = Table[Point[{x, y}], {x, -d, d}, {y, -d, d}];
r = RotationTransform[a];
rotatedpoints = points /. Point -> r;
finalpoints = rotatedpoints /. {x_, y_} -> Point[{x, y}];
setofpoints = Join[points, finalpoints];
densitypoints = setofpoints /. Point[{x_, y_}] -> {x, y};
flat = Flatten[densitypoints, 1];
i = VoronoiMesh[flat, PlotTheme -> "Lines"];
image = Rasterize[i];
ImageCrop[image, Round[ImageDimensions[image]/2]] // ImageAdjust
];
graphD[a, d] :=
With[{points = Table[Point[{x, y}], {x, -d, d}, {y, -d, d}]},
Graphics[{points, Rotate[points, a]}]];
{graphD[a, d], MoireVoD[a, d]}]
form1 = FormFunction[{"a" -> "Real", "d" -> "Integer"},
func1[#a, #d] &,
AppearanceRules -> <|"Title" -> "Moire and Voronoi",
"Description" ->
"Enter an Angle (a) between 0 and 1.569 (Radians) and a point \
density (d) between 4 and 17 to receive a Moire Pattern with its \
corresponding Voronoi Mesh"|>]
CloudDeploy[form1, "Moire and Voronoi", Permissions -> "Public"]
You can click [here ][9] to go to the Microsite.
Conclusions
-----------
We have seen how Mathematica has different ways to display the same data, in this case the information given by a Moire pattern. I am sure I missed a couple of functions, so if you have any suggestions for other functions that might represent Moire patterns better, feel free to discuss them in the reply section. Also, if you think anything in this post could have been done better, do not hesitate to reply. All feedback is greatly appreciated.
[1]: http://community.wolfram.com//c/portal/getImageAttachment?filename=8508Moire.png&userId=1138553
[2]: http://community.wolfram.com//c/portal/getImageAttachment?filename=LinearMoires.png&userId=1138553
[3]: http://community.wolfram.com//c/portal/getImageAttachment?filename=LinearMoires2.png&userId=1138553
[4]: http://community.wolfram.com//c/portal/getImageAttachment?filename=MoireMan.png&userId=1138553
[5]: http://community.wolfram.com//c/portal/getImageAttachment?filename=Skeletonsmol.png&userId=1138553
[6]: http://community.wolfram.com//c/portal/getImageAttachment?filename=Voronoi1.png&userId=1138553
[7]: http://community.wolfram.com//c/portal/getImageAttachment?filename=ManipHisto.PNG&userId=1138553
[8]: http://community.wolfram.com//c/portal/getImageAttachment?filename=MoireVor.PNG&userId=1138553
[9]: https://www.wolframcloud.com/objects/user-6b332570-e883-46c3-9c26-d2277ce655ad/Moire%20and%20Voronoi%20Alpha/%22Here%22

Oleg Rouvinski, 2017-08-10