My wrap-up from a very engaged and provocative 1.5-day workshop on geospatial technology futures, hosted by the CyberGIS Center: “Towards a National Geospatial Software Ecosystem”. First: a great group of cool peeps, all hyper-engaged in geospatial data, tools, use cases, science, and community. Second: it was fun to be involved in big-picture thinking on what a geospatial software institute might look like if it were built from scratch. Finally, I was on the panel discussing core questions bridging use cases and core technical capabilities, and I share my reflections on the workshop here.
- Question 1. Are there any significant gaps between the use cases and core technical capabilities that GSI should address?
- Training needs: beyond GIS training – “spatial data science” training, for K-12; undergrad; graduate; veterans; professionals
- Easy ways to get access to cloud storage and computation, and to different datasets, like those from UAVs. There are examples like CyVerse (from Tyson Swetnam) and others.
- Data integration: Data assimilation, Data fusion, Sensor triangulation.
- Whatever you want to call it – this remains a challenge for geospatial experts and beginners alike. And it is especially a challenge when you work across disciplines (e.g. the work of Mary Shelley and Margaret Palmer at SESYNC, University of Maryland)
- Dynamics: Spatio-temporal and real-time data streams: sensor networks, social media, cube sats
- in space (e.g. the new Antarctic DEM from Paul Morin, University of Minnesota);
- in time (e.g. cubesats, sensor networks; social media);
- in depth?: going underground (from Debra Laefer, NYU)
- We love FAIR for data. What about FAIR for tools: make tools Findable, Accessible, Interoperable, and Reusable?
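The FAIR-for-tools idea can be made concrete with a toy checklist. A minimal sketch, assuming a made-up set of criteria – these are illustrative, not a published rubric:

```python
# Toy sketch of a FAIR checklist for a software tool.
# The criteria below are illustrative, not an official rubric.
FAIR_CRITERIA = {
    "findable": ["has a DOI or persistent identifier", "indexed in a registry"],
    "accessible": ["public repository", "open license"],
    "interoperable": ["standard data formats", "documented API"],
    "reusable": ["documentation", "tests", "versioned releases"],
}

def fair_score(tool_checks: dict) -> float:
    """Fraction of FAIR criteria a tool satisfies (0.0 to 1.0)."""
    total = sum(len(items) for items in FAIR_CRITERIA.values())
    met = sum(
        1
        for principle, items in FAIR_CRITERIA.items()
        for item in items
        if item in tool_checks.get(principle, [])
    )
    return met / total

# A hypothetical tool that meets 4 of the 9 criteria above
tool = {
    "findable": ["has a DOI or persistent identifier"],
    "accessible": ["public repository", "open license"],
    "reusable": ["documentation"],
}
print(round(fair_score(tool), 2))  # 0.44
```

Even a crude score like this would let a registry rank tools by FAIRness and show maintainers where the gaps are.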
- Question 2: What does the Geospatial Software Institute (GSI) need to do to address community needs and contribute to the national CyberInfrastructure ecosystem?
- Link strongly with existing diversity-supporting frameworks: HBCU; community colleges; tribes; networks such as @WomenWhoCode, @LadiesOfLandsat, @BlackGirlsCode, @500womensci, @RLadiesGlobal, etc.
- More of these workshops! Hold multi-disciplinary meetings of people with tight/packed agendas, make use of workshop attendees between workshops, and figure out what we can do to spread the word
- Create GSI Data Institute or Bootcamp or Faculty Education Mentoring Network
- Support data and software standards to promote interoperability
- Support frameworks for data and software discovery and interoperability: FAIR for data; FAIR for tools
Conclusion: Super Fun. Learned a Ton. Plus parting words from Michael Goodchild: It is not location that matters, it is context. Location provides context; context allows integration: with data, between disciplines, between people, between tools. "Let's get above the layers".
As always, the Plenary session was an immersive and emotional showcase of the power of mapping. Running through Monday’s talks was a sense of urgency for us GIS people to save the world. This is what JD calls “societal GIS”, or “embracing the digital transformation and leveraging the science of where”. Shane and I had a great time. Some key news from the Plenary:
ESRI is in every K-12 school in the US; JD announced it will be offered to every K-12 school in the world.
The work of Thomas Crowther, Professor of Global Ecosystem Ecology at ETH Zürich (@crowthelab) is inspirational. They estimate 3T trees globally, with room for 1T more. (See paper here.) Gonna be checking out his tree data on the Living Atlas (global maps of tree density, diversity, carbon uptake, and reflectance).
A great demo from JD Irving, a private Canadian forestry, transportation, and products company heavily into sustainability and GIS. All their properties are managed using ArcGIS + R.
ESRI is showcasing some key "Solution Configurations": bundled software products focused on high-priority areas such as 1) community engagement ("Hub"); 2) interior spaces ("Indoors"); and 3) smart cities ("Urban"). The highlighted snazzy urban planning 3D vis tools will be giving UrbanSim a run for its money. Might we work RUCS2.0 into a "Solution Configuration" for working landscape planning?
Plus some highlights of what I learned overall:
Wow. ESRI's Living Atlas of the World has some amazing resources. Living Atlas is ESRI’s curated web data portal that links seamlessly with Pro. It has tons of data on environment and imagery. Want Sentinel-2 imagery, NAIP, or MODIS thermal? Want global climate and weather data? Want to easily play with Open Street Map or other vector tiles within your GIS project? It is all in the Living Atlas. This will be a game changer for class. Plus TC’s tree data. Gonna be checking this out.
Unstructured data (text, etc.) can now be added to your workflow. This is big.
Offering access to Open Street Map within Pro.
Software updates (mostly about Pro)
Pro is the way to go, but ESRI will continue to support ArcMap “for years to come”
New stuff in ArcGIS Pro related to Image Analysis:
Sensor support has been expanded, plus new formats are supported, e.g. netCDF. Pro supports mosaic datasets; they call mosaics the optimum data model for image management.
ESRI is now supporting “oriented” imagery - StreetView Imagery, oblique imagery, etc. Easily integrate things like iPhone photos within your Pro project. They call this working in “image space” rather than “map space”.
Ortho Mapping within ESRI has 3 solutions: Drone2Map (stand-alone software), within ArcGIS Pro (using the Image Server license), and OrthoMaker (web interface).
New release of Pro has full motion video support. (Upcoming releases will have more deep learning algorithms, multi-patch editing in stereo, and pixel editing.)
There are so many cool things going on on the imagery front in Pro, makes me excited.
New stuff in ArcGIS Pro in general:
Adding an unstructured data format - e.g. text!
3D editing and 3D voxel support.
Machine Learning is increasingly embedded in ESRI workflows, and when that is not enough, ML is also possible via linkages with external resources (via R, TensorFlow, MXNET, AWS tools, etc.).
ESRI increasingly recognizing that people work in and outside of ESRI software: R-Bridge, Python API, Jupyter Notebooks makes external linkages super easy.
ESRI is working to support cloud-based storage and computing via:
Support via AWS and Azure; Optimizing raster storage and caching in multiple formats; and the ability to point to existing cloud storage
Plus, for your GPS needs:
Trimble Catalyst antenna + ESRI Collector might be the way to go, but it is Windows/Android specific for now. iOS compatibility is "on the horizon" as of now.
A quick note about ArcGIS online (ESRI's complete mapping and location intelligence platform). It has 6M subscribers (!), making 1B maps a day (!!). (Did I get those numbers correctly?)
Notes for classes/workshops
GIS-stat-analysis-py-tutor on GitHub
ESRI provides many learning templates for those of us who are dreading converting all our ArcMap labs to Pro: https://www.esri.com/training/
ESRI is also working on providing templated best-practice workflows to help teach concepts. They call them, at least in Image Analyst, "Imagery workflows". Might be useful in class/workshops.
As always a great conference! The new ESRI terminology might be a useful organizing structure for class: A GIS is a system of:
- Record: storing spatially indexed information
- Insights: via analysis
- Engagement: through mapping and visualization
We've just wrapped up #DroneCamp2018, hosted at beautiful UC San Diego.
This was an expanded version of last year's model, which we held in Davis. We had 52 participants (from all over the world!) who were keen to learn about drones, data analysis, new technology, and drone futures.
Day 1 was a flight day for half our participants: lots of hands-on with takeoffs and landings, and flying a mission;
Day 2 covered drone safety and regulations, with guest talks from Brandon Stark and Dominique Meyer;
Day 3 covered drone data and analysis;
Day 4 was a flight day for Group 2 and a repeat of Day 1.
We had lots of fun taking pics and tweeting: here is our wrapup on Twitter for #DroneCamp2018.
I’ve been away from the blog for awhile, but thought I’d catch up a bit. I am in beautiful Madison Wisconsin (Lake Mendota! 90 degrees! Rain! Fried cheese curds!) for the NASA LP DAAC User Working Group meeting. This is a cool deal where imagery and product users meet with NASA team leaders to review products and tools. Since this UWG process is new to me, I am highlighting some of the key fun things I learned.
What is a DAAC?
A DAAC is a Distributed Active Archive Center, run by NASA's Earth Observing System Data and Information System (EOSDIS). These are discipline-specific facilities located throughout the United States. These institutions are custodians of EOS mission data and ensure that data will be easily accessible to users. Each of the 12 EOSDIS DAACs processes, archives, documents, and distributes data from NASA's past and current Earth-observing satellites and field measurement programs. For example, if you want to know about snow and ice data, visit the National Snow and Ice Data Center (NSIDC) DAAC. Want to know about social and population data? Visit the Socioeconomic Data and Applications Center (SEDAC). These centers of excellence are our taxpayer money at work collecting, storing, and sharing earth systems data that are critical to science, sustainability, economy, and well-being.
What is the LP DAAC?
The Land Processes Distributed Active Archive Center (LP DAAC) is one of several discipline-specific data centers within the NASA Earth Observing System Data and Information System (EOSDIS). The LP DAAC is located at the USGS Earth Resources Observation and Science (EROS) Center in Sioux Falls, South Dakota. LP DAAC promotes interdisciplinary study and understanding of terrestrial phenomena by providing data for mapping, modeling, and monitoring land-surface patterns and processes. To meet this mission, the LP DAAC ingests, processes, distributes, documents, and archives data from land-related sensors and provides the science support, user assistance, and outreach required to foster the understanding and use of these data within the land remote sensing community.
Why am I here?
Each NASA DAAC has established a User Working Group (UWG). There are 18 people on the LP DAAC UWG, including 12 members from the land remote sensing community at large, like me! Some cool stuff going on. Such as...
Two upcoming launches are super interesting and important to what we are working on. First, GEDI (Global Ecosystem Dynamics Investigation) will produce the first high-resolution laser ranging observations of the 3D structure of the Earth. Second, ECOSTRESS (the ECOsystem Spaceborne Thermal Radiometer Experiment on Space Station) will measure the temperature of plants: stressed plants get warmer than plants with sufficient water. ECOSTRESS will use a multispectral thermal infrared radiometer to measure surface temperature. The radiometer will acquire the most detailed temperature images of the surface ever acquired from space and will be able to measure the temperature of an individual farmer's field. Both of these sensors will be deployed on the International Space Station, so data will be in swaths, not continuous global coverage. Also, we got an update from USGS on the USGS/NASA plan for the development and deployment of Landsat 10: Landsat 9 comes in 2020, Landsat 10 comes ~2027.
Other Data Projects
We heard from other data providers, and of course we heard from NEON! Remember I posted a series of blogs about the excellent NEON open remote sensing workshop I attended last year. NEON also hosts a ton of important ecological data, and has been thinking through the issues associated with cloud hosting. Tristan Goulden was here to give an overview.
NASA staff gave us a series of demos on their WebGIS services, AppEEARS, and their data website. Their webGIS site uses ArcGIS Enterprise, and serves web image services, web coverage services, and web mapping services from the LP DAAC collection. This might provide some key help for us in IGIS and our REC ArcGIS Online toolkits. AppEEARS is their way of providing bundles of LP DAAC data to scientists; it is a data extraction and exploration tool. They also demoed the LP DAAC data website redesign (coming soon), which was necessitated by the requirement for a permanent DOI for each data product.
LP DAAC is going full-force in user engagement: they do workshops, collect user testimonials, write great short pieces on “data in action”, work with the press, and generally get the story out about how NASA LP DAAC data is used to do good work. This is a pretty great legacy and they are committed to keep developing it. Lyndsey Harriman highlighted their excellent work here.
Grand Challenges for remote sensing
Some thoughts about our Grand Challenges: 1) Scaling: From drones to satellites. It occurs to me that an integration between the ground-to-airborne data that NEON provides and the satellite data that NASA provides had better happen soon; 2) Data Fusion/Data Assimilation/Data Synthesis, whatever you want to call it. Discovery through datasets meeting for the first time; 3) Training: new users and consumers of geospatial data and remote sensing will need to be trained; 4) Remote Sensible: Making remote sensing data work for society.
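The data fusion challenge often starts with something as basic as making two datasets that have never met line up in space. A minimal sketch of one flavor of this – attaching attributes from one point dataset to another by nearest neighbor – with all coordinates and attribute names made up for illustration:

```python
import math

# Minimal sketch of point-based data fusion: each observation inherits an
# attribute from its nearest neighbor in a second dataset. All coordinates
# and attribute names are made up for illustration.
sensors = [
    {"id": "s1", "x": 0.0, "y": 0.0, "temp_c": 21.5},
    {"id": "s2", "x": 10.0, "y": 0.0, "temp_c": 24.0},
]
observations = [
    {"obs": "a", "x": 1.0, "y": 1.0},
    {"obs": "b", "x": 9.0, "y": 1.0},
]

def nearest(point, candidates):
    """Return the candidate closest to point by Euclidean distance."""
    return min(
        candidates,
        key=lambda c: math.hypot(c["x"] - point["x"], c["y"] - point["y"]),
    )

# Fuse: each observation gets the temperature of its nearest sensor
for o in observations:
    o["temp_c"] = nearest(o, sensors)["temp_c"]

print([o["temp_c"] for o in observations])  # [21.5, 24.0]
```

Real fusion (assimilating drone, airborne, and satellite data) adds reprojection, resampling, and uncertainty handling on top of this, but the join-across-datasets step is the common core.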
A primer on cloud computing
We spent some time on cloud computing. It has been said that cloud computing is just putting your stuff on “someone else’s computer”, but it is also making your stuff “someone else’s problem”, because the cloud handles all the painful aspects of serving data: power requirements, buying servers, speccing floor space for your servers, etc. Plus, there are many advantages of cloud computing, including:
- Elasticity: elastic in computing and storage (you can scale up, scale down, or scale sideways), and elastic in terms of money (you pay for only what you use).
- Speed: commercial clouds’ CPUs are faster than ours, and you can use as many as you want – near-real-time processing, massive processing, compute-intensive analysis, deep learning.
- Size: you can customize this; you can be fast and expensive or slow and cheap. You use as much as you need: short-term storage of large interim results, or long-term storage of data that you might use one day.
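The pay-for-what-you-use point can be made concrete with a toy cost comparison. All the prices and hours below are made-up numbers, not real quotes from any provider:

```python
# Toy illustration of cloud elasticity economics. All numbers are made up.
OWNED_SERVER_COST_PER_YEAR = 5000.0  # hardware, power, floor space, admin
CLOUD_RATE_PER_CPU_HOUR = 0.10       # hypothetical on-demand price

def cloud_cost(cpu_hours: float) -> float:
    """Pay only for what you use."""
    return cpu_hours * CLOUD_RATE_PER_CPU_HOUR

# A bursty workload: 200 CPU-hours a month for 2 months, idle the rest
yearly_usage = 200 * 2
print(round(cloud_cost(yearly_usage), 2))  # 40.0

# Elasticity: 1000 CPUs for 1 hour costs the same as 1 CPU for 1000 hours,
# but the answer comes back a thousand times sooner.
print(cloud_cost(1000 * 1) == cloud_cost(1 * 1000))  # True
```

The elasticity point in the last line is the real draw for bursty research workloads: you can buy time instead of hardware.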
Image courtesy of Chris Lynnes
We can use the cloud as infrastructure, for sharing data and results, and as software (e.g. ArcGIS Online, Google Earth Engine). Above is a cool graphic showing one vision of the cloud as a scaled and optimized workflow that takes advantage of the cloud: from pre-processing, to an analytics-optimized data store, to analysis, to visualization. Why this is a better vision: some massive processing engines, such as Spark and others, require that data be organized in a particular way (e.g. Google Big Table, Parquet, or DataCube). This means we can really crank on processing, especially with giant raster stacks. And at each step in the workflow, end users (be they machines or people) can interact with the data. Those are the green boxes in the figure above. Super fun discussion, leading to the importance of training, and how to do this best. Tristan also mentioned CyVerse, a new NSF project, which they are testing out for their workshops.
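The staged workflow above can be sketched as a chain of small functions – each stage hands an analysis-ready product to the next, and end users can tap in at any stage. The function names, field names, and numbers here are all illustrative, not any particular platform's API:

```python
# Sketch of the staged cloud workflow: pre-processing -> analytics-optimized
# store -> analysis. Names and values are illustrative only.

def preprocess(raw_scenes):
    """Clean raw scenes (here: just drop scenes flagged cloudy)."""
    return [s for s in raw_scenes if not s["cloudy"]]

def to_analysis_ready(scenes):
    """Reorganize into an analytics-optimized layout (here: band -> values)."""
    store = {}
    for s in scenes:
        store.setdefault(s["band"], []).append(s["value"])
    return store

def analyze(store):
    """Compute a per-band mean over the stack."""
    return {band: sum(v) / len(v) for band, v in store.items()}

raw = [
    {"band": "red", "value": 0.2, "cloudy": False},
    {"band": "red", "value": 0.4, "cloudy": False},
    {"band": "nir", "value": 0.8, "cloudy": False},
    {"band": "nir", "value": 0.1, "cloudy": True},  # discarded in preprocessing
]

result = analyze(to_analysis_ready(preprocess(raw)))
print({band: round(v, 2) for band, v in result.items()})  # {'red': 0.3, 'nir': 0.8}
```

The point of the analytics-optimized middle stage is that once data is laid out for the engine (band-interleaved here; Parquet or a data cube in practice), the per-pixel analysis over a giant raster stack becomes a cheap, parallelizable reduction.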
Image attribution: Corey Coyle
Super fun couple of days. Plus: Wisconsin is green. And warm. And Lake Mendota is lovely. We were hosted at the University of Wisconsin by Mutlu Ozdogan. The campus is gorgeous! On the banks of Lake Mendota (image attribution: Corey Coyle), the 933-acre (378 ha) main campus is verdant and hilly, with tons of gorgeous 19th-century stone buildings, as well as modern ones. Founded when Wisconsin achieved statehood in 1848, UW–Madison is the flagship campus of the UW System. It was the first public university established in Wisconsin and remains the oldest and largest public university in the state. It became a land-grant institution in 1866. UW hosts nearly 45K undergrad and graduate students. It is big! It has a med school and a law school on campus. We were hosted in the UW red-brick Romanesque-style Science Building (opened in 1887). Not only is it the host building for the geography department, it also has the distinction of being one of the first buildings in the country constructed of all masonry and metal materials (wood was used only in window and door frames and for some floors), and may be the only one of those still extant. How about that! Bye Wisconsin!
Many of us have watched in horror and sadness over the previous week as fires consumed much of the beautiful hills and parts of the towns of Napa and Sonoma Counties. Many of us know people who were evacuated with a few minutes' notice - I met a retired man who left his retirement home with the clothes on his back. Many other friends lost everything - house, car, pets. It was a terrible event - or series of events, as there were many active fires. During those 8+ days all of us were glued to our screens searching for up-to-date and reliable information on where the fires were, and how they were spreading. This information came from reputable, reliable sources (such as NASA, or the USFS), from affected residents (on Twitter and other social media), from businesses (like Planet, ESRI, and Digital Globe, who were sometimes creating content and sometimes distilling existing content), and from the media (who were often using all of the above). As a spatial data scientist, I am always thinking about mapping, and the ways in which geospatial data and analysis play an increasingly critical role in disaster notification, monitoring, and response. I am collecting information on the technological landscape of the various websites, media and social media, map products, data, and imagery that played a role in announcing and monitoring the #TubbsFire, #SonomaFires and #NapaFires. I think a retrospective of how these tools, and in particular the citizen science aspect of all of this, helped and hindered society will be useful.
In the literature, the theoretical questions surrounding citizen science or volunteered geography revolve around:
Accuracy – how accurate are these data? How do we evaluate them?
Access – Who has access to the data? Are there technological limits to dissemination?
Bias/Motivation – who contributes to the data, and what sampling biases result? These are critical.
Effectiveness – how effective are the sites? Some scholars have argued that VGI can be inhibiting.
Control - who controls the data, and how and why?
Privacy - Are privacy concerns lessened post disaster?
I think I am most interested in the accuracy and effectiveness questions, but all of them are important. If any of you want to talk more about this or have more resources to discuss, please email me: email@example.com, or Twitter @nmaggikelly.
Summary so far. This will be updated as I get more information.
Outreach from ANR About Fires
ANR has a number of programs dedicated to fire preparedness, recovery, and prevention.
Core Geospatial Technology During Fires
Fire perimeters from https://www.geomac.gov/services.shtml
The Active Fire Perimeters layer is a product of Geospatial Multi-Agency Coordination (GeoMAC). “In order to give fire managers near real-time information, fire perimeter data is updated daily based upon input from incident intelligence sources, GPS data, infrared (IR) imagery from fixed wing and satellite platforms.”
MODIS hot spots from the USFS Active Fire Mapping Program. The MODIS instrument is on board NASA's Earth Observing System (EOS) Terra (EOS AM) and Aqua (EOS PM) satellites. In addition to lots of other data, MODIS delivers Channel 31 brightness temperature (in Kelvin) of a hotspot/active fire pixel.
For more on how these are made: http://www.arcgis.com/home/item.html?id=b4ce4179b04f47e4ba79e234205565c1
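As a toy illustration of how a brightness-temperature channel can flag hotspots, the sketch below converts Kelvin to Celsius and applies a simple threshold. The 360 K cutoff and pixel values are made-up illustrative numbers, not the actual MODIS fire-detection algorithm, which uses multiple channels plus contextual tests around each pixel:

```python
# Toy hotspot flagging from Channel 31 brightness temperature (Kelvin).
# The 360 K threshold is illustrative only; the real MODIS fire product
# uses multiple channels and contextual tests around each pixel.
HOTSPOT_THRESHOLD_K = 360.0

def kelvin_to_celsius(t_k: float) -> float:
    """Convert a brightness temperature from Kelvin to Celsius."""
    return t_k - 273.15

def is_hotspot(brightness_temp_k: float) -> bool:
    """Flag a pixel as a candidate hotspot if it exceeds the threshold."""
    return brightness_temp_k >= HOTSPOT_THRESHOLD_K

pixels_k = [295.0, 310.0, 402.5]  # made-up pixel temperatures
print([is_hotspot(t) for t in pixels_k])   # [False, False, True]
print(round(kelvin_to_celsius(402.5), 2))  # 129.35
```

The real product's contextual tests (comparing each pixel to its neighbors) are what keep sun-baked bare ground from being flagged as fire, but the core signal is exactly this: an anomalously hot thermal pixel.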
Media using maps (super short list)
NYTimes: Minutes to Escape: How One California Wildfire Damaged So Much So Quickly. https://www.nytimes.com/interactive/2017/10/21/us/california-fire-damage-map.html?smid=fb-nytimes&smtyp=cur
NYTimes: How California's Most Destructive Wildfire Spread, Hour by Hour. OCT. 21, 2017. https://www.nytimes.com/interactive/2017/10/21/us/california-fire-damage-map.html?smid=fb-nytimes&smtyp=cur
- SFGate focusing on SkyIMD: http://www.sfgate.com/news/article/incredible-aerial-photos-wine-country-fires-12285123.php
Twitter: #sonomafires, #napafires, #tubbsfire
Flickr: SonomaFires, TubbsFires, NapaFires, etc.
Core Technology for Post-Fire Impact
High resolution imagery collection and analysis
Planet has made fire imagery available: https://www.planet.com/pulse/northern-california-wildfire-satellite-data-available-for-access/
Digital Globe + MapBox made a post-fire tool (Author: @robinkraft, Email: firstname.lastname@example.org, Github repo, Source: Overview News // DigitalGlobe 2017): https://robinkraft.github.io/norcal-fires-imagery/compare.html; https://blog.mapbox.com/santa-rosa-fire-satellite-imagery-a31b6dfefdf8
- Digital Globe - Open Data Program: https://www.digitalglobe.com/opendata
Sonoma Veg Map Program has a few links to interesting stuff including ArcGIS Server Drone and Digital Globe Imagery: http://sonomavegmap.org/blog/2017/10/17/fires/
I know SkyIMD was flying the entire time.
- First Map - Flights 10/11-10/14/17 Here: http://www.skyimd.com/napa-sonoma-fire-imagery-map/
- Second Map - Flight 10/20/17 - Here: http://www.skyimd.com/napa-sonoma-fire-imagery-map-2/
Greg Crutsinger from @droneScholars made this available: Aftermath of #TubbsFire: https://www.mapbox.com/bites/00382/#18/38.4763/-122.74861
Open Aerial Map has one drone image (not sure of the source): https://map.openaerialmap.org/#/-122.7497720718384,38.471986484020334,16/square/023010201213/59e62be93d6412ef7220c4c0?_k=hm0j7l