In September 2020, UC Berkeley's Rausser College of Natural Resources selected the Kellylab for a Sponsored Project for Undergraduate Research (SPUR) award for the proposal "Mapping municipal funding for police in California." The project partnered with Mapping Black California (MBC), a Southern California-based collective that uses technology, data, geography, and place-based study to better understand and connect African American communities in California.

We met weekly during the fall semester and gathered data from 472 cities across California, detailing per-capita police funding and the percentage of each municipal budget spent on the police department. California has nearly 500 incorporated cities; most have their own police departments and set an annual budget that determines how much those departments receive. The variability in police spending across the state is striking (see the figures below). On average, municipalities spend about 20% of their budgets on policing, but while some spend less than 5%, others allocate more than half. Per-capita police spending averages about $500 but ranges from roughly $10 to well over $2,000.

If you are interested in this project, explore our findings through the Story Map: examine Southern California and the Bay Area in detail, check out a few interesting cities, or search for a city and click on it to see how much it spent on policing in 2017.
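As a rough illustration of the kind of calculation behind these figures, here is a minimal pandas sketch. The file name and column names (`police_spending`, `total_budget`, `population`) are hypothetical stand-ins for the fields in the State Controller's data, not the actual schema we used.

```python
import pandas as pd

# Hypothetical extract of the State Controller's Cities Finances data;
# the file name and column names are illustrative, not the real schema.
df = pd.read_csv("ca_city_finances_2017.csv")

# Per-capita police spending and police share of the municipal budget
df["police_per_capita"] = df["police_spending"] / df["population"]
df["police_pct_budget"] = 100 * df["police_spending"] / df["total_budget"]

# Statewide summaries, echoing the numbers described above
print(df["police_pct_budget"].describe())
print(df["police_per_capita"].describe())
```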

Figure showing variability in Police Spending (% of municipal budget) in Northern California in 2017. Data from California State Controller's Cities Finances Data, 2017 (City and police spending information). For more information see the Story Map here.

Figure showing variability in Police Spending (per capita) in Northern California in 2017. Data from California State Controller's Cities Finances Data, 2017 (City and police spending information). For more information see the Story Map here.
The analyst on the project has been Ben Satzman, a UC Berkeley Conservation and Resource Studies major with minors in Sustainable Environmental Design and GIS, who worked in collaboration with MBC and the Kellylab to find, clean, visualize, and analyze the statewide data. We plan to continue the project by exploring the possible influences (such as racial diversity, crime, poverty, ethnicity, income, and education) underlying these regional trends and patterns in police spending. Personnel involved in the project are: from Mapping Black California, Candice Mays (Partnership Lead), Paulette Brown-Hinds (Director), Stephanie Williams (Executive Editor, Content Lead), and Chuck Bibbs (Maps and Data Lead); from the Kellylab, Maggi Kelly (Professor and CE Specialist), Chippie Kislik (Graduate Student), Christine Wilkinson (Graduate Student), and Annie Taylor (Graduate Student).
We thank the Rausser College of Natural Resources, which funded this effort.
My usual update from the ESRI UC is a bit tougher this year, since I am working from home on one screen, so note-taking is a bit rough. And I kind of miss the whole razz-ma-tazz of Day 1 on site. But here goes:
OK, so Jack's Plenary is always the place to get a big view of new software releases, and 2020 is no different: a sweep of the software improvements coming this summer. Here is my (very) quick summary of highlights. Items with '*' are those that will be useful in class (I hope).
The Conference Theme is Collaboration
What's coming in ArcGIS Pro and AGOL:

Data:
- New layers
- Better integration with OSM *

Visualization (New Map Apps * - launching this fall):
- Beta now in AGOL
- Better styling, better color ramps, and better dynamic interaction with color ramps *
- Dot density mapping!
- Clustering and new labeling
- Filtering data
- Some cool color blending as an alternative to transparency! *

Cartography in Pro:
- Charts

Story Maps:
- Optimize for mobile
- Collections
- StoryTeller role

Spatial Analytics and Data Science:
- New suitability modeling tool *
- Spider diagrams
- Modeling
- AI, big data, ML
- Jupyter Notebooks inside of ArcGIS Pro * (see the sketch after this list)
- AGOL implementing Jupyter Notebooks

Imagery and Remote Sensing:
- Image management - ready-to-use workflows and content
- Feature extraction
- Analytics - classification, etc.
- Something called "Excalibur" - web-based imagery exploitation: search and find, feature extraction, add to a database

Drone Mapping: *
- Drone2Map on desktop
- Site Scan - cloud-based solutions

3D Mapping:
- Jack loves voxels

Real-time Analytics:
- Cloud-based sensor data storage and management

Data Management:
- Improved editing: 2D and 3D editing improvements *
- Field Maps app - in beta, and should streamline things

And Enterprise runs on Kubernetes…

All leading up to ArcGIS 2021 next year.
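Since notebooks are coming to both Pro and AGOL, here is a minimal sketch of the kind of thing I mean, using the ArcGIS API for Python. The search query and layer choice are hypothetical examples; this just shows the notebook pattern of connecting, searching content, and mapping a result.

```python
# Minimal notebook sketch using the ArcGIS API for Python.
# The search query and item choice are hypothetical examples.
from arcgis.gis import GIS

gis = GIS()  # anonymous connection to ArcGIS Online

# Search public content (hypothetical query)
items = gis.content.search("California cities", item_type="Feature Layer")
for item in items[:5]:
    print(item.title)

# Display a web map centered on a place of interest
m = gis.map("Berkeley, CA")
if items:
    m.add_layer(items[0])
m  # in a notebook, this last expression renders the map widget
```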
OK deep breath, off for a lunch break.
Every fall I ask my GIS students to answer the big questions in advance of their class projects. This year climate change, wildlife conservation, land use, and water quality are important topics, along with a number of others. Remote sensing continues to be important to GISers. Scientists, government, and communities need to work together to solve problems.
Why?
- What does the proposed project hope to accomplish?
- What is the problem that needs to be addressed?
- What do you expect to happen?
How?
- What analysis approach will be used?
- Why was this approach selected?
- What are alternative methods?
- Is the analysis reproducible?
What?
- What are the datasets that are needed?
- Where will they come from?
- Have you downloaded and checked this dataset?
- Do you have a backup dataset?
Who?
- Who will care about this? And why?
- How will they use the results?
- Will they be involved in the entire workflow?
Here are the responses from Fall 2017:
Figures showing student responses, Fall 2017.
So much to learn! Here is my distillation of the main take-homes from last week.
Notes about the workshop in general:
- Having participants complete organized homework and install software in advance is necessary for complicated workshop content: http://neondataskills.org/workshop-event/NEON-Data-Insitute-2017
- NEON used tips from Software Carpentry workshops, particularly the green-and-pink sticky note tip: put a pink sticky on your computer when you need help, and a green sticky when you are all good. This makes everything go smoother and means participants don't have to hold up a hand to get attention.
- Having lots of helpful, friendly faces around during the coding bits, and having access to the code when you fell behind, was critical.
- The workshop content and daily schedule:
NEON data and resources:
- http://www.neonscience.org/resources/data-tutorials
- All the NEON airborne data can be found here: http://www.neonscience.org/data/airborne-data
- For more on when data rolls out, sign up for the NEON eNews here: http://www.neonscience.org/
Other misc. tools:
- For cleaning messy data - check out OpenRefine - a FOS tool for cleaning messy data http://openrefine.org/
- Excel is cray-cray, best practices for spreadsheets: http://www.datacarpentry.org/spreadsheet-ecology-lesson/
- Morpho (from DataOne) to enter metadata: https://www.dataone.org/software-tools/morpho
- Pay attention to file size with your git repositories - check out: https://git-lfs.github.com/. Git is good for things you do with your hands (like code), not for large data.
- Markdown renderer: http://dillinger.io/
- MIT License, like Creative Commons for code: https://opensource.org/licenses/MIT
- There is a new project called "Feather" that allows compatibility between python and R: https://blog.rstudio.org/2016/03/29/feather/
- Information on the TIFF specification and TIFF tags is here: http://awaresystems.be/; note their TIFF Tag Viewer is Windows-only.
- All NEON point cloud classifications are done with LASTools. Go LASTools! https://rapidlasso.com/lastools/
- Check out pdal - like gdal for point clouds. It can be used from bash. Learned from my workshop neighbor Sergio Marconi https://www.pdal.io/
- Reflectance Tarps are made by GroupVIII http://www.group8tech.com/
- ATCOR http://www.rese.ch/products/atcor/ says we should be able to rely on 3-5% error on reflectance when atmospheric correction is done correctly (say that 10 times fast) with a well-calibrated instrument.
- NEON hyperspectral data is stored in HDF5 format. HDFView is a great tool for interrogating the metadata, among other things. https://support.hdfgroup.org/products/java/hdfview/
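Since the hyperspectral data ship as HDF5, a quick way to interrogate a file's structure and metadata without HDFView is h5py. A minimal sketch, assuming a local NEON reflectance file (the filename here is a hypothetical example):

```python
import h5py

# Hypothetical NEON reflectance file; the name is just an example.
path = "NEON_reflectance_example.h5"

with h5py.File(path, "r") as f:
    # Walk the hierarchy, printing every group and dataset
    def describe(name, obj):
        if isinstance(obj, h5py.Dataset):
            print(f"{name}  shape={obj.shape}  dtype={obj.dtype}")
        else:
            print(name)

    f.visititems(describe)

    # File-level attributes hold much of the metadata HDFView would show
    for key, value in f.attrs.items():
        print(key, "=", value)
```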
First of all, Pearl Street Mall is just as lovely as I remember, but OMG it is so crowded, with so many new stores and chains. Still, good food, good views, hot weather, lovely walk.
Welcome to Day 2! http://neondataskills.org/data-institute-17/day2/
Our morning session focused on reproducibility and workflows with the great Naupaka Zimmerman. Remember the characteristics of reproducibility: organization, automation, documentation, and dissemination. We focused on organization, and spent an enjoyable hour sorting through an example messy directory of miscellaneous data files and code. The directory looked a bit like many of my directories. Lesson learned. We then moved to working with new data and git to reinforce yesterday's lessons. Git was super confusing to me two weeks ago, but now I think I love it. We also went back and forth between Jupyter notebooks and standalone Python scripts, abstracting variables (see the sketch below), and lo and behold I got my script to run.
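By "abstracting variables" I mean pulling hard-coded values out of the body of a script so the same code runs in a notebook or from the command line. A minimal sketch of that pattern (the parameters are hypothetical examples, not the workshop's actual script):

```python
import argparse

def count_lines(path, skip_comments=True):
    """Count data lines in a text file, optionally skipping '#' comments."""
    with open(path) as f:
        return sum(
            1 for line in f
            if not (skip_comments and line.lstrip().startswith("#"))
        )

if __name__ == "__main__":
    # Variables are abstracted into command-line arguments
    # instead of being hard-coded in the script body.
    parser = argparse.ArgumentParser(description="Count data lines in a file.")
    parser.add_argument("path", help="input text file")
    parser.add_argument("--keep-comments", action="store_true",
                        help="count comment lines too")
    args = parser.parse_args()
    print(count_lines(args.path, skip_comments=not args.keep_comments))
```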
The afternoon focused on lidar (yay!). Before coding we talked about discrete-return and waveform data and collection, and about the OpenTopography project (http://www.opentopography.org/) with Benjamin Gross. The OpenTopography talk was really interesting: they are not just a data distributor any more, they also provide an HPC framework (mostly TauDEM for now) on their servers at SDSC (http://www.sdsc.edu/). They are going to roll out user-initiated HPC functionality soon, so stay tuned for their new "pluggable assets" program. This is well worth checking into. We also spent some time live-coding in Python with Bridget Hass, working with a CHM (canopy height model) from the NEON SERC site, and had a nerve-wracking code challenge to wrap up the day.
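For flavor, here is a minimal sketch of that kind of CHM work, using rasterio and matplotlib rather than the exact workshop code; the filename is a hypothetical stand-in for a NEON CHM GeoTIFF.

```python
import numpy as np
import matplotlib.pyplot as plt
import rasterio

# Hypothetical NEON CHM tile; any single-band GeoTIFF of canopy height works.
with rasterio.open("NEON_CHM_tile.tif") as src:
    chm = src.read(1).astype(float)
    if src.nodata is not None:
        chm[chm == src.nodata] = np.nan  # mask the nodata fill value

print(f"Canopy height: mean {np.nanmean(chm):.1f} m, "
      f"max {np.nanmax(chm):.1f} m")

# Quick look: height map plus a histogram of nonzero heights
fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))
im = ax1.imshow(chm, cmap="viridis")
fig.colorbar(im, ax=ax1, label="height (m)")
heights = chm[np.isfinite(chm)]
ax2.hist(heights[heights > 0], bins=50)
ax2.set_xlabel("height (m)")
plt.show()
```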
Fun additional take-home messages/resources:
- ISO international standard for dates = YYYY-MM-DD
- Missing values in R = NA, in Python = -9999 (see the parsing sketch after this list)
- For cleaning messy data - check out OpenRefine - a FOS tool for cleaning messy data http://openrefine.org/
- Excel is cray-cray, best practices for spreadsheets: http://www.datacarpentry.org/spreadsheet-ecology-lesson/
- Morpho (from DataOne) to enter metadata: https://www.dataone.org/software-tools/morpho
- Pay attention to file size with your git repositories - check out: https://git-lfs.github.com/. Git is good for things you do with your hands (like code), not for large data.
- Funny how many food metaphors are used in tech teaching: APIs as a menu in a restaurant; git add vs git commit as a grocery cart before and after purchase; finding GIS data is sometimes like shopping for ingredients in a specialty grocery store (that one is mine)...
- Markdown renderer: http://dillinger.io/
- MIT License, like Creative Commons for code: https://opensource.org/licenses/MIT
- "Jupyter" means it runs with Julia, Python & R, who knew?
- There is a new project called "Feather" that allows compatibility between python and R: https://blog.rstudio.org/2016/03/29/feather/
- All the NEON airborne data can be found here: http://www.neonscience.org/data/airborne-data
- Information on the TIFF specification and TIFF tags is here: http://awaresystems.be/; note their TIFF Tag Viewer is Windows-only.
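A minimal pandas sketch tying the date and missing-value notes together; the CSV name and column name are hypothetical examples, and -9999 is the NEON-style fill value mentioned above.

```python
import pandas as pd

# Hypothetical field-data CSV; NEON-style files use -9999 as a fill value.
df = pd.read_csv(
    "field_data.csv",
    parse_dates=["date"],   # expects ISO YYYY-MM-DD dates
    na_values=[-9999],      # convert the fill value to NaN on read
)

print(df.dtypes)       # the 'date' column parses to datetime64[ns]
print(df.isna().sum()) # formerly -9999 cells are now proper missing values
```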
Thanks to everyone today! Megan Jones (our fearless leader), Naupaka Zimmerman (reproducibility), Tristan Goulden (discrete lidar), Keith Krause (waveform lidar), Benjamin Gross (OpenTopography), and Bridget Hass (coding lidar products).

Our home for the week