City-level Search Tool for Coronavirus (COVID-19) Confirmed Cases
David Lomiashvili, Wolfram Research Inc
Posted 1 year ago | 2214 Views | 2 Replies | 5 Total Likes

MODERATOR NOTE: coronavirus resources & updates: https://wolfr.am/coronavirus
I wanted to make a cloud applet that would allow users to enter the name of a city and receive the number of confirmed cases there. Data quality, key outputs and their visualisations were going to be important for the usefulness of the tool. I used CloudDeploy and FormPage to build an interactive search tool that queries "Patient Medical Data for Novel Coronavirus COVID-19" from the Wolfram Data Repository.
Row[{
  Hyperlink["OPEN CLOUD APP",
   URL["https://www.wolframcloud.com/obj/davidl0/CoronavirusTool"],
   FrameMargins -> 10, BaseStyle -> White, ActiveStyle -> Black,
   Background -> Red, Appearance -> "Palette"],
  Text["[NOTE: Long load times]"]}]

Out[]= (the rendered "OPEN CLOUD APP" button linking to the cloud app, with the label "[NOTE: Long load times]")
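Before going further, here is a minimal sketch of the CloudDeploy + FormPage pattern mentioned in the introduction. caseCount is a hypothetical helper standing in for the counting logic developed later in this post, and the deployment name and options are my assumptions, not the author's actual code:

(* hypothetical helper: count confirmed cases whose GeoPosition falls within a given city *)
caseCount[city_] := Count[
  GeoWithinQ[city,
   Normal@Query[Select[#Country == EntityValue[city, "Country"] &], "GeoPosition"]@
    ResourceData["Patient Medical Data for Novel Coronavirus COVID-19"]],
  True]

(* deploy a form that asks for a city and displays the count *)
CloudDeploy[
 FormPage[{"city" -> "City"},
  Column[{Style[#city, Bold], Row[{"Confirmed cases: ", caseCount[#city]}]}] &],
 "CoronavirusToolSketch", Permissions -> "Public"]

The "City" field type makes FormPage interpret free-text input as a city entity, which is what lets the rest of the pipeline work with GeoWithinQ and EntityValue directly.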
Objectives & Motivation
There are a few things that (almost) equally contributed to my motivation for doing this project:
1. I found it strange that basically no news source or search engine provided the number of confirmed cases at the city level, which is arguably a more important measure to people than a state-level number. It would be foolish to disregard the importance of commuting and other daily movement of people between cities, but I'd argue that what impacts people's behavior and state of mind most is the severity of spread in their own locality: city, town, etc. [UPDATE: Since I wrote this post, the NYT released a dataset with county-level data.]
2. With all the great resources and content that we (Wolfram Research) are producing around this topic, making interesting computational explorations, essays and deployed applets becomes straightforward: the COVID-19 data is right there in the Wolfram Data Repository (WDR), ready to be ingested with a single line of code.
3. After reading so many insightful posts by our community members, I got bitten by the contribution bug (sorry for the bad pun).
4. I've always wanted to post on our Community, and given my family's current sheltered-in-place situation, I finally had the free time to take on a pet project using WL.
Project Breakdown
A great thing about this project is that the data is basically served on a platter (by WDR), so there's no need to clean and wrangle it for hours. All we need to do is some data exploration and then count the relevant cases, although the counting will require some sophisticated WL functionality. Overall, the project involves importing, exploring and computing on data, and eventually deploying a data "product".
I will be describing the following stages of the project in this post:
◼ Import the data
◼ Explore the data and assess what is achievable
◼ Map out the computations and output visualisations
◼ Deploy and automate
Data Import
Having aggregated patient data available in a computable form in the WDR was one of the main reasons I was able to finish this tool in a weekend (I’m not the fastest when it comes to coding).
So, let’s import the dataset object from WDR:
coviddataobject = ResourceObject["Patient Medical Data for Novel Coronavirus COVID-19"]

Out[]= ResourceObject[Name: Patient Medical Data for Novel Coronavirus COVID-19 | Type: DataResource | Description: Medical records of patients infected with novel coronavirus COVID-19]
WDR supports auto-updating of resource objects, but if that fails we may need to manually delete the object and re-initialize it:
DeleteObject[coviddataobject]
Just to make sure the data was successfully imported, let’s do a quick examination of the dataset using a RandomSample of 10:
coviddata = ResourceData[coviddataobject];
RandomSample[coviddata, 10]
Out[]= (a Dataset of 10 random rows; the populated fields are shown below, while the remaining columns of the 23-field schema (Age, Sex, Symptoms, travel history, outcomes, etc.) are empty in this sample)

City      AdministrativeDivision         Country      GeoPosition      DateOfConfirmation
Xiangfan  Hubei, China                   China        31.93°N 111.9°E  Sat 1 Feb 2020
—         Hamedan, Iran                  Iran         34.87°N 48.55°E  Tue 3 Mar 2020
—         Taegu-gwangyoksi, South Korea  South Korea  35.78°N 128.6°E  Sat 22 Feb 2020
Wuhan     Hubei, China                   China        30.63°N 114.3°E  Sat 8 Feb 2020
Xiaogan   Hubei, China                   China        31.12°N 113.9°E  Sat 1 Feb 2020
Wuhan     Hubei, China                   China        30.63°N 114.3°E  Sun 16 Feb 2020
Wuhan     Hubei, China                   China        30.63°N 114.3°E  —
—         Neimenggu, China               China        40.21°N 109.9°E  Mon 3 Feb 2020
Wuhan     Hubei, China                   China        30.63°N 114.3°E  Sat 8 Feb 2020
Wuhan     Hubei, China                   China        30.63°N 114.3°E  Mon 27 Jan 2020
Looks like we got the data.
Next, let’s see what type of information is actually in it.
Data Exploration
Data Fields
As we can see from the column headers of the dataset in the previous section (and by extracting Keys of the dataset), there are 23 data fields.
coviddata[1] // Keys

Out[]= {Age, Sex, City, AdministrativeDivision, Country, GeoPosition, DateOfOnsetSymptoms, DateOfAdmissionHospital, DateOfConfirmation, Symptoms, LivesInWuhan, LivesInWuhanComment, TravelHistoryDates, TravelHistoryLocation, ReportedMarketExposure, ReportedMarketExposureComment, ChronicDiseaseQ, ChronicDiseases, SequenceAvailable, DischargedQ, DeathQ, DateOfDeath, DateOfDischarge}
This is great, though we won't be using most of these fields for this project's purposes. We will primarily be utilizing the location and temporal data in the dataset.
How up-to-date is our data
A very important factor for the usefulness of our tool is how current the underlying data is. We can find this out by taking the maximum of the "DateOfConfirmation" values.
coviddata[Max, #DateOfConfirmation &]

Out[]= Day: Fri 13 Mar 2020
Not quite real time, but within a one-day window.
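If you'd rather compute the lag than eyeball it, a one-liner like this works (a small addition of mine, not in the original post):

(* days between the newest confirmation date in the data and today *)
DateDifference[coviddata[Max, #DateOfConfirmation &], Today, "Day"]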
Location Data
Let’s examine just the location fields:
coviddata[All, {"City", "AdministrativeDivision", "Country", "GeoPosition"}]
Out[]=

City     AdministrativeDivision  Country  GeoPosition
Hefei    Anhui, China            China    31.65°N 117.7°E
Hefei    Anhui, China            China    31.78°N 117.3°E
Hefei    Anhui, China            China    31.83°N 117.2°E
Hefei    Anhui, China            China    31.83°N 117.2°E
Hefei    Anhui, China            China    32.0°N 117.6°E
Luan     Anhui, China            China    31.76°N 116.3°E
Fuyang   Anhui, China            China    32.92°N 115.7°E
Huaibei  Anhui, China            China    33.73°N 116.7°E
Huainan  Anhui, China            China    32.76°N 116.7°E
Hefei    Anhui, China            China    31.79°N 117.3°E
Luan     Anhui, China            China    31.76°N 116.3°E
Fuyang   Anhui, China            China    32.92°N 115.7°E
Anqing   Anhui, China            China    30.61°N 116.6°E
Chizhou  Anhui, China            China    30.29°N 117.4°E
Bengbu   Anhui, China            China    33.11°N 117.3°E
—        Beijing, China          China    40.21°N 116.2°E
—        Beijing, China          China    40.21°N 116.2°E
—        Beijing, China          China    40.21°N 116.2°E
—        Beijing, China          China    39.64°N 116.4°E
—        Beijing, China          China    39.64°N 116.4°E
(showing 1–20 of 4926)
So, it seems that some of the entries are missing the City field. This may be because they are associated with rural areas (or even non-land areas, e.g. cruise ships) or because the data is incomplete. The good news is that we can get precise GeoPosition data, which we will use to identify the city/municipality associated with each case.
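As a quick aside (my illustration, not part of the original workflow), GeoNearest can recover a city entity from a bare coordinate; here it is applied to one of the Wuhan positions from the sample above:

(* find the city entity nearest to a raw GeoPosition; should return Wuhan for this coordinate *)
GeoNearest["City", GeoPosition[{30.63, 114.3}]]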
Next, let's derive the expressions we need to extract the insights we are interested in.
Outputs & Visualisation
Date
As discussed earlier, it’s important we indicate the date when the dataset was last updated:
coviddata[Max, #DateOfConfirmation &]

Out[]= Day: Fri 13 Mar 2020
Case count
Given that our goal is to make a cloud applet that allows a user to enter the name of a city and receive the number of confirmed cases there, and that the "City" column is missing a lot of values, we'll need another way of finding a corresponding "City" value that works for every case.
We can do this using the GeoWithinQ function, which returns True or False depending on whether a point (given by GeoPosition) lies within the polygon of a given entity (in this case a city). So we'll check, row by row, whether each confirmed case occurred in the city based on its GeoPosition. We can reduce the amount of geo-computation needed by first filtering the cases down to the country that contains the city, and only then doing the row-by-row check with GeoWithinQ.
Let’s assume we are querying the data for Chicago, IL:
EntityValue[Entity["City", {"Chicago", "Illinois", "UnitedStates"}], "Country"]

Out[]= United States
Normal@Query[
   Select[#Country == EntityValue[Entity["City", {"Chicago", "Illinois", "UnitedStates"}], "Country"] &],
   "GeoPosition"]@coviddata

Out[]= {GeoPosition[{41.8781, -87.6298}], GeoPosition[{48.0482, -121.696}], GeoPosition[{33.7033, -117.761}], GeoPosition[{34.05, -118.25}], GeoPosition[{33.4128, -111.943}], GeoPosition[{41.8781, -87.6298}], GeoPosition[{37.2317, -121.693}], GeoPosition[{42.3601, -71.0589}], GeoPosition[{37.2317, -121.693}], GeoPosition[{36.6064, -121.074}], ⋯ 1528 ⋯, GeoPosition[{37.3802, -119.678}], ...}
This gives us the GeoPositions for every US case. Now we need to check which ones fall within the Chicago area.
Count[
 GeoWithinQ[Entity["City", {"Chicago", "Illinois", "UnitedStates"}],
  Normal@Query[
     Select[#Country == EntityValue[Entity["City", {"Chicago", "Illinois", "UnitedStates"}], "Country"] &],
     "GeoPosition"]@coviddata],
 True]

Out[]= 9
Perfect! So, we are able to find City values based on GeoPositions.
In terms of visualisation of results, we can construct a simple but effective display using rows and columns:
Rasterize[Row[Colu...
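The original code is cut off at this point. For completeness, here is a minimal sketch of the kind of Rasterize/Row/Column display the text describes; the labels, numbers and layout are my assumptions, not the author's code:

(* assumed layout: city name and case count side by side, rasterized for display *)
Rasterize[
 Row[{
   Column[{Style["Chicago", Bold, 16], Style["Confirmed cases", Gray]}],
   Spacer[20],
   Column[{Style["9", Bold, 24], Style["as of Fri 13 Mar 2020", Gray]}]}]]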