Day 3: I opened the day with a lovely swim with Elizabeth Havice (in the largest pool in New England? Boston? The Sheraton?) and then embarked on a multi-mile walk around the fair city of Boston. The sun was out and the wind was up, showing the historical buildings and waterfront to great advantage. The 10-year-old Institute of Contemporary Art was showing in a constrained space, but it did host an incredibly moving video installation from Steve McQueen (director of 12 Years a Slave) called “Ashes” about the life and death of a young fisherman in Grenada.
My final AAG attendance involved two plenaries hosted by the Remote Sensing Specialty Group and the GIS Specialty Group, who, in their wisdom, decided to host plenaries by two absolute legends in our field – Art Getis and John Jensen – at the same time. #battleofthetitans. #gisvsremotesensing. So, I tried to get what I could from both talks. I started with the Waldo Tobler Lecture given by Art Getis: The Big Data Trap: GIS and Spatial Analysis. Compelling title! His perspective as a spatial statistician on the big data phenomenon is a useful one. He talked about how fast data are growing: every minute – 98K tweets; 700K Facebook updates; 700K Google searches; 168+ million emails sent; 1,820 TB of data created. Big data is growing in spatial work; new analytical tools are being developed, data sets are generated, and repositories are growing and becoming more numerous. But there is a trap. And here it is. The trap of Big Data:
10 Erroneous assumptions to be wary of:
- More data are better
- Correlation = causation
- Gotta get on the bandwagon
- I have an impeccable source
- I have really good software
- I am good at creating clever illustrations
- I have taken requisite spatial data analysis courses
- It’s the scientific future
- Accessibility makes it ethical
- There is no need to sample
He then asked: what is the role of spatial scientists in the big data revolution? He says our role is to find relationships in a spatial setting; to develop technologies or methods; to create models and use simulation experiments; to develop hypotheses; to develop visualizations and to connect theory to process.
The summary from his talk is this: Start with a question; Differentiate excitement from usefulness; Appropriate scale is mandatory; and Remember more may or may not be better.
When Dr Getis finished I made a quick run down the hall to hear the end of the living legend John Jensen’s talk on drones. This man literally wrote the book on remote sensing, and he is the consummate teacher – always eager to teach and extend his excitement to a crowded room of learners. His talk was entitled Personal and Commercial Unmanned Aerial Systems (UAS) Remote Sensing and their Significance for Geographic Research. He presented a practicum about UAV hardware, software, cameras, applications, and regulations. His excitement about the subject was obvious, and at parts of his talk he did a call and response with the crowd. I came in as he was beginning his discussion on cameras, and he also discussed practical experience with flight planning, data capture, and highlighted the importance of obstacle avoidance and videography in the future. Interestingly, he has added movement to his “elements of image interpretation”. Neat. He says drones are going to be routinely part of everyday geographic field research.
What a great conference, and I feel honored to have been part of it.
Day 1: Wednesday I focused on the organized sessions on uncertainty and context in geographical data and analysis. I’ve found AAGs to be more rewarding if you focus on a theme rather than jump from session to session. But fewer steps on the iWatch, of course. There were nearly 30 (!) sessions of speakers presenting on these topics throughout the conference.
An excellent plenary session on New Developments and Perspectives on Context and Uncertainty started us off, with Mei Po Kwan and Michael Goodchild providing overviews. We need to create reliable geographical knowledge in the face of the challenges raised by uncertainty and context: people and animals move through space, phenomena are multi-scaled in space and time, and data are heterogeneous, all of which make the creation of knowledge difficult. There were sessions focusing on sampling, modeling, & patterns; on remote sensing (mine); on planning and sea level rise; on health research; on urban context and mobility; and on big data, data context, data fusion, and visualization of uncertainty. What a day! All of this is necessarily interdisciplinary. Here are some quick insights from the keynotes.
Mei Po Kwan focused on uncertainty and context in space and time:
- We all know about the MAUP concept, what about the parallel with time? The MTUP: modifiable temporal unit problem.
- Time is very complex. There are many aspects of time: momentary, time-lagged response, episodic, duration, cumulative exposure
- How do we aggregate, segment and bound spatial-temporal data in order to understand process?
- The basic message is that you must really understand uncertainty: Neighborhood effects can be overestimated if you don’t include uncertainty.
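Kwan's MTUP point can be sketched with a toy example: the same hourly exposure series tells a very different story depending on the temporal unit chosen to aggregate it. A minimal sketch, with all values invented for illustration:

```python
import numpy as np

# Toy hourly exposure series over two days: a two-hour morning spike
# each day, flat otherwise. All values are invented for illustration.
hourly = np.ones(48)
hourly[[8, 9, 32, 33]] = 10.0

# Aggregate the same data with two different temporal units.
daily = hourly.reshape(2, 24).sum(axis=1)    # two daily totals
six_hour = hourly.reshape(8, 6).sum(axis=1)  # eight six-hour totals

print(daily)     # the spike vanishes: both days look identical
print(six_hour)  # the spike survives: two blocks stand out
```

Just as zone boundaries drive MAUP results, the choice of temporal bins here determines whether the episodic exposure is visible at all.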
As expected, Michael Goodchild gave a master class in context and uncertainty. No one else can deliver such complex material so clearly, with a mix of theory and common sense. Inspiring. Anyway, he talked about:
- Data are a source of context:
- Vertical context – other things that are known about a location, that might predict what happens and help us understand the location;
- Horizontal context – things about neighborhoods that might help us understand what is going on.
- Both of these aspects have associated uncertainties, which complicate analyses.
- Why is geospatial data uncertain?
- Location measurement is uncertain
- Any integration of location is also uncertain
- Observations are non-replicable
- Loss of spatial detail
- Conceptual uncertainty
- This is the paradox. We have abundant sources of spatial data, and they are potentially useful. Yet all of them are subject to myriad types of uncertainty. And the conceptual definition of context is itself fraught with uncertainty.
- He then talked about some tools for dealing with uncertainty, such as areal interpolation, and spatial convolution.
- He finished with some research directions, including focusing on behavior and pattern, better ways of addressing confidentiality, and development of a better suite of tools that include uncertainty.
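One of the tools he mentioned, areal interpolation, is easy to sketch in its simplest area-weighted form, which assumes the attribute is uniformly distributed within each source zone. All zone names, areas, and counts below are hypothetical:

```python
# Area-weighted areal interpolation: reallocate a count from source zones
# to a target zone in proportion to overlapping area. All numbers are
# invented for illustration.
source_zones = {
    "A": {"population": 1000, "area": 10.0},
    "B": {"population": 400,  "area": 8.0},
}
# Overlap area between the target zone and each source zone (hypothetical).
overlap = {"A": 5.0, "B": 2.0}

def areal_interpolate(sources, overlaps):
    """Estimate the target-zone count, assuming each attribute is
    uniformly distributed within its source zone."""
    total = 0.0
    for name, zone in sources.items():
        frac = overlaps.get(name, 0.0) / zone["area"]  # share of zone overlapped
        total += zone["population"] * frac
    return total

est = areal_interpolate(source_zones, overlap)
print(est)  # 1000 * 0.5 + 400 * 0.25 = 600.0
```

The uniform-distribution assumption is exactly the kind of conceptual uncertainty Goodchild flagged: the estimate is only as good as that assumption.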
My session went well – I got a great question from Mark Fonstad about the real independence of errors – as in canopy height and canopy base height are likely correlated, so aren’t their errors? Why do you treat them as independent? Which kind of blew my tiny mind, but Qinghua stepped in with some helpful words about the difficulties of sampling from a joint probability distribution in Monte Carlo simulations, etc.
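Fonstad's point can be illustrated with a quick sketch: when two inputs' errors are correlated, drawing them jointly (here from a multivariate normal, with invented standard deviations and correlation) yields a very different uncertainty for a derived quantity like crown length (height minus base height) than drawing them independently:

```python
import numpy as np

# Error propagation when inputs are correlated, e.g. canopy height and
# canopy base height. The SDs and correlation below are illustrative only.
rng = np.random.default_rng(42)
n = 100_000
sigma_h, sigma_b, rho = 2.0, 1.5, 0.8  # assumed error SDs and correlation

cov = np.array([[sigma_h**2,              rho * sigma_h * sigma_b],
                [rho * sigma_h * sigma_b, sigma_b**2             ]])

# Joint draw respects the correlation between the two error terms.
joint = rng.multivariate_normal([0.0, 0.0], cov, size=n)
crown_err_joint = joint[:, 0] - joint[:, 1]

# Independent draws ignore the correlation.
crown_err_indep = rng.normal(0, sigma_h, n) - rng.normal(0, sigma_b, n)

# Analytic SDs: sqrt(s_h^2 + s_b^2 - 2*rho*s_h*s_b) vs sqrt(s_h^2 + s_b^2)
print(crown_err_joint.std())   # ~1.2
print(crown_err_indep.std())   # ~2.5
```

With positively correlated errors, the difference of the two quantities is much less uncertain than the independence assumption suggests, which is why treating the errors as independent in a Monte Carlo run can badly misstate the propagated uncertainty.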
Plus we had some great times with Jacob, Leo, Yanjun and the Green Valley International crew who were showcasing their series of Lidar instruments and software. Good times for all!
What a full week! Here is my wrap-up from a great 2016 ESRI User Conference.
I haven't been here in many years, and I am glad I came. I learned much, have some new ideas for workshops and classes and for how IGIS can be of more service to ANR, and I got started on ArcGIS Pro - ESRI's eventual replacement for ArcGIS Desktop. Pro has a completely new user interface that is very clear; you can visualize, edit, and perform analysis in both 2D and 3D; it is super fast via multithreading & 64-bit processing (finally); and it has new icons and a bunch of new processing tools. A bunch of very cool stuff comes with 1.3 soon.
Day 1: Monday was spent in big-picture, inspirational talks from camera beautiful GIS people. 15,000 in the audience, 10 massive screens. I loved it, I felt like I was at a really intense but sedate rock concert. Note to ESRI: could you put the chairs any closer together? The highlight was speaker keynote Andrea Wulf, talking about her obsession with Alexander Von Humboldt. Note to self: get the book. In addition to the big picture stuff, Day 1 is ESRI's chance to highlight this year's software improvements as we continue the voyage away from the desktop workflow: Pro, integrated 3D, green design, Apps, seamless integration with the web.
Day 2: Tuesday focused on workshops. I picked four workshops from the Spatial Statistics Team at ESRI. These were led by Lauren Bennett and her crew (Flora Vale, Jenora D'Acosta). Uniformly fantastic. I had downloaded Pro the night before, and with some trepidation got started and followed along. I am happy to report that it seems very intuitive. I have read elsewhere about worries that there is loss of cartographic control, and I will look into that. I learned about the Spatial Stats toolbox in Pro, and some very new capacity in optimization of pattern analysis (you know how difficult it is to pick that distance kernel), and in the new space-time cube capabilities. The space-time capabilities make very complex analyses doable, and are very exciting, but still a bit constraining if you don't know how to update the code. Oh yeah, and space-time uses netCDF format.
Day 3: For Wednesday's workshops I chose ones that would help me update class labs: Python + raster analysis; site suitability in Pro; cost connectivity in Pro; and a crazy cool prediction tool called Empirical Bayesian Kriging, which I will be checking out. I guess EBK has been around for a while, but it is now implemented in ESRI software. The new suite of tools in site suitability + connectivity are going to be key. Kevin M. Johnston and Elizabeth Graham led the site suitability and connectivity workshops, and Eric Krause led the kriging workshop.
Day 4: All day was spent in Pix4D immersion with the excellent Pix4D support/training team. Pix4D is the gold standard for drone imagery workflow; it also serves as the backend engine for ESRI's Drone2Map application, which I have not tried. Most of the morning was spent in basics: workflow basics, application domains, super wow factor examples like 0.5cm resolution imagery processing. We also looked at workflow and best practices, accuracy, and some example projects. The room was full of 50+ people, many with specific questions about a range of projects. Got to hang out a bit with Greg Crustinger, who is now at Parrot. Even more excited now about our new Sequoia cameras.
- Little Italy has some great restaurants.
- We need to move to Pro soon. Probably not in time for Fall's class, but soon. GIF and IGIS workshops are going to have to get updated.
- I need to get more in touch with imagery analysis in Pro. Especially with the segmentation and classification part.
- I want to recreate the workflow for site suitability + locate regions + cost connectivity soon.
- Performing complex analyses in a GUI is increasingly easy, but is that a good thing? We have to be increasingly vigilant about training the fundamentals as well.
- One frustration about these workshops that I bet my workshop participants share: the data are all perfect and ready to go. We need to keep talking about where to get data, and how to wrangle it into shape.
- Could drones mean the resurrection of photogrammetry? At least in the classroom?
- Hyper granularity: how do we process these increasingly fine resolution datasets?
- Global to local focus in modeling: GWR, optimized Getis-Ord, and empirical Bayesian kriging all try to deal with and model local variability across a complex study area;
- Incorporation of permutations and distribution functions in modeling has been made way easier;
- Big Data, multidimensional data, temporal data: ESRI is really trying to be a better informatics platform for research;
- ESRI seems to be increasingly relying on external and open standards for new data formats/products; this is a great trend;
- Decision-making: all these analyses need to support decision-making; communication remains critical, tools for web-based interaction continue to expand.
Last week we held another bootcamp on Spatial Data Science. We had three packed days learning about the concepts, tools and workflow associated with spatial databases, analysis and visualizations. Our goal was not to teach a specific suite of tools but rather to teach participants how to develop and refine repeatable and testable workflows for spatial data using common standard programming practices.
On Day 1 we focused on setting up a collaborative virtual data environment through virtual machines and spatial databases (PostgreSQL/PostGIS) with multi-user editing and versioning (GeoGig). We also talked about open data and open standards, and modern data formats and tools (GeoJSON, GDAL). On Day 2 we focused on open analytical tools for spatial data, concentrating on Python (e.g. PySAL, NumPy, PyCharm, IPython Notebook) and R. Day 3 was dedicated to the web stack, and visualization via ESRI Online, CartoDB, and Leaflet. Web mapping is great, and as OpenGeo.org says: “Internet maps appear magical: portals into infinitely large, infinitely deep pools of data. But they aren't magical, they are built of a few standard pieces of technology, and the pieces can be re-arranged and sourced from different places. … Anyone can build an internet map.”
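As a tiny illustration of the open formats we covered, a GeoJSON feature is just structured JSON, and can be built with nothing but the Python standard library (the coordinates and name below are arbitrary examples):

```python
import json

# A minimal GeoJSON FeatureCollection built with only the standard library.
# Per the GeoJSON spec, coordinates are [longitude, latitude].
feature = {
    "type": "Feature",
    "geometry": {"type": "Point", "coordinates": [-122.259, 37.872]},
    "properties": {"name": "Berkeley"},
}
collection = {"type": "FeatureCollection", "features": [feature]}

# Serialize; the resulting text is readable by any GeoJSON-aware tool
# (GDAL/ogr2ogr, Leaflet, CartoDB, ...).
geojson_text = json.dumps(collection)
parsed = json.loads(geojson_text)
print(parsed["features"][0]["properties"]["name"])  # Berkeley
```

That transparency is the point of the quote above: the pieces of an internet map are plain, inspectable standards, not magic.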
All-in-all it was a great time spent with a collection of very interesting mapping professionals from around the country. Thanks to everyone!
The annual AAG conference is rolling into town next week, and several of us will be there.
- Kelly and Jenny will be presenting;
- Kelly: Disentangling drivers of change in California Forests: management and climate
- Jenny: Spatial Data Science for Collaborative Geospatial Research
- Alice is a discussant on THREE panels; and
- I am a discussant on the Historical Ecology session.
Former kellylabbers will also be in force:
- John Connors is presenting (and organizing, and moderating, and all kinds of things):
- Disentangling Diversity: Agrobiodiversity, Livelihoods, and Food Security in the Kilombero Valley, Tanzania
- Desheng Liu will be there:
- Reconstructing Land Cover Trajectories from Dense MODIS Time Series
Have a great time everyone! (If I have missed anyone, let me know!)