If you are a Berkeley undergrad and GIS enthusiast interested in learning new GIS skills and contributing to an important project, consider applying for our Fall 2020 SPUR research project. The Kellylab will mentor one SPUR student on a project with Mapping Black California (MBC) to map the proportion of municipal budgets that fund police departments in municipalities across California.
This project will: 1) gather information from websites of several major cities or municipalities in California; 2) visualize results on a map of California; and 3) create a Story Map presentation showcasing the work.
It will be fun, rewarding, and important work!
Read more and apply here by August 31.
My usual update from ESRI UC is a bit tougher this year, since I am working from home on one screen, so note-taking is a bit rough. And I kind of miss the whole razzmatazz of Day 1 on site. But here goes:
OK, so Jack’s Plenary is always the place to get a big view of new software releases, and 2020 is no different: a sweep of the software improvements coming this summer. Here is my (very) quick summary of highlights. Items with ‘*’ are those that will be useful in class (I hope).
The Conference Theme is Collaboration
What’s coming in ArcGIS Pro and AGOL:
Better integration with OSM*
Visualization (New Map Apps* - launching this fall):
Beta now in AGOL
Better styling, better color ramps, and better dynamic interaction with color ramp *
Dot density mapping!
Clustering and new labeling
Some cool color blending as an alternative to transparency! *
Cartography in Pro:
Optimize for mobile
Spatial Analytics and Data Science
New suitability modeling tool *
AI, Big Data, ML
Jupyter Notebooks inside of ArcGIS Pro *
AGOL implementing Jupyter Notebooks
Imagery and Remote Sensing
Image management - ready to use workflows and content
Analytics - classification, etc.
Something called “Excalibur” - web-based exploitation. Search and find, feature extraction, add to a database
Drone Mapping *
Drone2Map on desktop
Site Scan - cloud-based solutions
Jack loves voxels
Cloud-based sensor data storage and management
Improving editing: 2D and 3D editing improvements *
Field Maps App
In beta, and should streamline things.
And Enterprise runs on kubernetes…
All leading up to ArcGIS 2021 next year.
OK deep breath, off for a lunch break.
- Author: Andy Lyons
The Decision to Go Online. When COVID-19 struck, the DroneCamp Planning Committee had to quickly decide how to respond when it became apparent an in-person event would not be possible. By the time we officially canceled the in-person event in early April, registration had already started. We had a couple weeks to decide - do we cancel? Postpone? Hold it virtually?
Given the amount of planning that had already been invested (we started meeting in October 2019), and the fact that only one component (flight practice) would not translate at all to a virtual format, the Committee decided to move forward with an online event. We were admittedly intrigued by the possibilities of an online format, including a potentially larger audience. I was equally excited about the potential for an expanded pool of instructors, as we had been trying for years to get colleagues from distant campuses to participate. We were torn whether to maintain the same 3-day time frame, or spread it out over several weeks as some virtual conferences are doing. We ultimately decided to maintain the same one-week time frame, in part because instructors had already committed to those days. To mitigate Zoom fatigue, we aimed to limit programming to 2-3 hours each morning and 2 hours each afternoon. Sessions would be scheduled on the early side to accommodate potential participants on the east coast.
A Need for New Coordination Technology. After committing ourselves to an online event, the details and requirements for what we needed started to hit me. To manage such an ambitious program, we needed a way to coordinate workshop info, software instructions, and exercise materials involving nearly two dozen instructors, and share everything in a clear and organized manner with a couple hundred participants (which in the end grew to over 300). We also needed a way to manage registration for multiple Zoom sessions, including Webinars for the plenaries and concurrent Meetings for the hands-on software workshops.
Having been through this rodeo before, I knew two lessons for a successful conference. First, the key to a successful collaborative event is catering not just to the needs of participants, but equally if not more importantly to those of your presenters. Second, almost everyone - instructors and participants alike - waits until the 11th hour to pay attention. Hence whatever tool, or set of tools, we used had to be user-friendly and as automated as possible, so that last minute changes could be accommodated with a minimum of manual steps.
Specifically, I needed to create three information tools:
- An Online Program, consisting of a mix of content from the Committee as well as instructors, that can be updated quickly and share a variety of links for slides, data, Zoom, etc.
- A way for people to sign up for concurrent workshops and get individualized Zoom links after cross-checking their registration status.
- A customized summary of participants generated from registration data from the ANR survey platform
Ideally, we would have had a Learning Management System to use, or at least an event planning platform to build upon. That was not our case, so I had to turn to my go-to tool stack of DIY solutions: Google Apps Script, R, markdown, and GitHub.
The Online Program
An online event needs an online program. DroneCamp in particular needed a program that could be quickly updated before and during the event, was user-friendly for instructors to update their session info, and combined a range of both static and dynamic content including session titles and descriptions, thumbnail images, scheduling info, and links for slides, exercises, datasets, and Zoom.
To make life as easy as possible for busy instructors, I pre-populated as much of each session's info as I could, including draft session titles and descriptions. Knowing that session descriptions would need links to software installation instructions and other details, I created the equivalent of auto-text fields that, when rendered by my R script, were replaced with values stored on a different tab of the Google Sheet.
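The auto-text idea can be sketched in a few lines of Python (the actual implementation was an R script pulling values from a Google Sheet tab; the `{{field}}` syntax and the field names below are hypothetical, just for illustration):

```python
import re

# Hypothetical auto-text values, standing in for the lookup tab in the Google Sheet
autotext = {
    "install_link": "https://example.com/install",
    "zoom_note": "Zoom links will be emailed before the session",
}

def render(description: str, fields: dict) -> str:
    """Replace {{field}} placeholders with their stored values;
    unknown placeholders are left intact for manual review."""
    def sub(match):
        return fields.get(match.group(1), match.group(0))
    return re.sub(r"\{\{(\w+)\}\}", sub, description)

draft = "Install the software ({{install_link}}) before class. {{zoom_note}}."
print(render(draft, autotext))
```

Storing the values in one place meant a change to, say, an installation link only had to be made once and would propagate to every session description on the next render.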
Knowing that GitHub occasionally goes down, or could theoretically throttle our site if there was too much traffic, I created a backup site on an Ubuntu web server that IGIS manages. Fortunately we never needed to use it, but it was good to have a Plan B.
Page views are just the tip of the iceberg from Google Analytics. I don't have a huge amount of experience with Google Analytics, but I hope to dig into this some more next time to see what else I can glean about our users. Another area for future development is implementing click tracking, which Google Analytics supports, to learn more about how people are using the site.
Creating a Zoom Registration Dashboard
Probably our biggest headache was managing the multiple Zoom sessions. Our requirements were more complex than your average webinar, as we needed to:
- Manage a combination of Zoom Webinars (for plenary talks), and Zoom Meetings (for the hands-on software workshops). To complicate things further, we only had access to 1 webinar license (which we couldn't use the entire time because of other commitments), so had to ask colleagues to schedule webinars for us. We also needed up to 5 concurrent Zoom meetings for demos, meaning 5 different hosts.
- Make use of Zoom's registration features, to ensure only registered participants could join and prevent Zoombombers
- Provide a mechanism for participants to sign-up for concurrent workshops, and receive individualized Zoom links for the appropriate session after cross-checking their registration status
- Minimize additional registration steps participants had to complete, including registering for multiple Zoom meetings.
Biggest Challenge: Managing Zoom Meeting Registration. In the 3 months leading up to DroneCamp, we learned a lot about Zoom's dizzying array of dials and levers for meetings, webinars, user accounts, and the Zoom client. My biggest challenge was how to register 300+ people for roughly 8 concurrent Zoom meetings, preferably without asking them to click on individual Zoom registration links. This was a fairly big hurdle because unlike Zoom webinars, the Zoom website does not provide a way to upload a CSV file of registrants for a Meeting. Additionally, I needed a solution that would cross-check our DroneCamp registration list, and send individualized connection links. All of this had to be done as efficiently as possible to cater for the dozens I knew would not read any of our emails until after the opening session, and accommodate people who would inevitably change their mind after the course started.
Strike 2: Third Party API Integration Services. Next, I turned to the Zoom API, beginning with the dozens of third party integration services (e.g., automate.io, integromat, appypie.com, hull.io, tray.io). These platforms serve as the connective tissue for all manner of online services and web apps, helping you integrate your various web APIs (like Google Sheets and Zoom). They are not free, but they're easy to set up and use, so if I had found one that got the job done it would have been a small price to pay. Reading through the details, however, I discovered they mostly supported Zoom webinars, not meetings, and few if any of them supported registration management. Another dead end.
But the authentication step is daunting. The simplest way to authenticate a custom Zoom App is creating a JSON Web Token (JWT), which is the equivalent of sending a password. But that option was not available to me because our organizational ANR Zoom account is locked down for security reasons and I'm not an administrator. Thus I had to go with a user authentication approach using the OAuth protocol, which requires programming a complex ‘dance' between login page, redirects, IP white lists, and secret client keys and tokens under the hood. One would think there would be ample examples of authenticating Zoom's enormously popular API from Google's enormously popular Google Apps platform, but such is not the case. After a couple of late nights of stringing together various chunks of sample code, and playing whack-a-mole with cryptic error codes and configuration settings, I finally got it to work. I could now create custom menus in Google Sheets to schedule new Zoom meetings, define their properties, and register people for them.
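For the curious, the OAuth authorization-code ‘dance' boils down to two exchanges: send the user to a consent page, then trade the returned code for an access token. A Python sketch of those two steps (the real code ran in Google Apps Script; the client ID, secret, and redirect URI below are placeholders, not real credentials):

```python
import base64
from urllib.parse import urlencode

# Placeholder credentials; real values come from a registered Zoom OAuth app
CLIENT_ID = "my_client_id"
CLIENT_SECRET = "my_client_secret"
REDIRECT_URI = "https://example.com/oauth/callback"

def authorization_url() -> str:
    """Step 1: build the URL that sends the user to Zoom's consent page."""
    params = urlencode({
        "response_type": "code",
        "client_id": CLIENT_ID,
        "redirect_uri": REDIRECT_URI,
    })
    return f"https://zoom.us/oauth/authorize?{params}"

def token_request(auth_code: str) -> dict:
    """Step 2: assemble the POST that exchanges the returned code for an
    access token, authenticated with Basic auth (client_id:client_secret)."""
    basic = base64.b64encode(f"{CLIENT_ID}:{CLIENT_SECRET}".encode()).decode()
    return {
        "url": "https://zoom.us/oauth/token",
        "headers": {"Authorization": f"Basic {basic}"},
        "data": {
            "grant_type": "authorization_code",
            "code": auth_code,
            "redirect_uri": REDIRECT_URI,
        },
    }
```

The hard part in practice wasn't these two requests, but wiring the redirect and token storage into Google's side of the fence.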
The next step was to integrate this new capability into a Google Sheets app for DroneCamp. The Google Sheet serves as the 'hub' for several Google Forms, our registration data, and Zoom. It uses conditional formatting, a whole bunch of VLOOKUP and IMPORTRANGE functions, and a Custom menu with a sidebar GUI to manage the Zoom functions. I called the result the Zoom Dashboard.
Each column of the Zoom Dashboard represents a Zoom meeting, and each row represents a participant. When a DroneCamp participant submits the Concurrent Session Sign-up form, a script runs putting a ‘1' in the corresponding column of the Zoom dashboard. The Zoom dashboard also had a copy of the ANR Registration Survey, and a combination of VLOOKUP functions and conditional formatting rules to flag people who did not have a matching registration. As Administrator, I could quickly check the registration status of people who requested a particular workshop, change the ‘1' to another code if needed, and then when everything was good run a menu command to actually register people for the meeting.
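The cross-check logic behind the Dashboard can be sketched in Python (the data below is made up, and the real version lives in Google Sheets VLOOKUP formulas and Apps Script, not Python):

```python
# Hypothetical data standing in for the Google Sheet tabs
registered = {"ana@example.com", "raj@example.com"}

# Sign-up grid: participant email -> {meeting name: code}; '1' = requested
signups = {
    "ana@example.com": {"Drone Software Workshop": "1"},
    "raj@example.com": {"Data Processing Demo": "1"},
    "typo@exmaple.com": {"Drone Software Workshop": "1"},
}

def flag_unmatched(signups, registered):
    """Mimic the VLOOKUP/conditional-formatting check: return sign-ups
    with no matching registration, for case-by-case review."""
    return sorted(email for email in signups if email not in registered)

def roster(signups, meeting):
    """Everyone who requested a given meeting and passed the check -
    the list handed to the Zoom registration script."""
    return sorted(e for e, cols in signups.items()
                  if cols.get(meeting) == "1" and e in registered)

print(flag_unmatched(signups, registered))        # → ['typo@exmaple.com']
print(roster(signups, "Drone Software Workshop"))  # → ['ana@example.com']
```

Keeping the "request" step separate from the "register" step is what made it cheap to fix typos or change a code before any Zoom API calls happened.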
The key to creating an interactive Google Sheets app is using sidebars to present the user with a GUI. I had never programmed sidebars before, but had seen them used a lot, including in many Google Sheets add-ons. After another night of reading documentation and programming, I eventually figured it out. Behind the scenes, a sidebar is simply an HTML template with form elements that are configured to execute Google Apps Script functions. This gives you enormous flexibility in terms of both layout and functionality, which is pretty cool.
The system worked fairly well, and the vast majority of participants got their Zoom links quickly and in ample time. Last minute changes in workshop preferences were handled with relative ease. It took less than a minute to review new workshop requests, validate them, and run the registration script. A small number of participants had recurring hiccups due primarily to problems on their end, including i) registering with multiple email addresses despite being told repeatedly not to do so, ii) making typos in their email addresses (shockingly common), or iii) failing to take the recommended steps to prevent Zoom confirmation emails getting flagged as spam. Our phenomenal PSU staff were the frontline dealing with questions about Zoom; but I intercepted many of these issues by first flagging problematic entries with conditional formatting rules on the Google Sheet, and then investigating them on a case-by-case basis.
Other features for the Zoom Dashboard that I ran out of time to implement included writing a script that would extract a participant's individual Zoom links for multiple meetings and send them in a single email. That would be extremely useful for an event like ours with so many sessions. I also had plans to write an auto-reminder email for Zoom meetings, another feature that Zoom only provides for Webinar licenses but can be done via the API.
Summary of Participants
We used the in-house ANR Survey System for registration and payment. Similar to platforms like Eventbrite, the ANR Survey Tool provides a highly customizable registration form linked to multiple payment options (including inter-campus transfers). Unlike Eventbrite however, this older tool doesn't have a built-in report generator, nor an API providing access to the underlying data. Instead, you manage registrations through a web interface, which works well and includes an option to download the results as an Excel file.
We had no idea how many people would sign up, so I wanted to create an HTML summary of registered participants so our instructors and planning team could stay abreast of registrations as they trickled in. I knew we'd get people from all over, so I wanted the summary to have an interactive map showing participants' locations, as well as various summaries about their organizations and titles. I also knew that I would have to update this summary on an almost daily basis, so I needed the summary to be generated as automatically as possible. There was no getting around having to manually download an Excel file, but my goal for everything after that was to do it "at the click of a button".
Geocoding. We didn't ask people on the registration form to enter their longitude–latitude coordinates, which meant I had to invoke a geocoding engine. However, I didn't want to geocode a participant more than once, both because it costs ‘credits' and takes a couple of seconds. I also needed a way to ‘help' the geocoding engine when a participant's location info was incomplete (e.g., the name of a university, but not the city or zip code).
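The geocode-once caching idea, sketched in Python (the actual workflow was in R; the cache filename and the stand-in geocoder function are hypothetical):

```python
import json
import os

CACHE_FILE = "geocode_cache.json"  # hypothetical filename

def load_cache() -> dict:
    """Read previously geocoded addresses from disk, if any."""
    if os.path.exists(CACHE_FILE):
        with open(CACHE_FILE) as f:
            return json.load(f)
    return {}

def save_cache(cache: dict) -> None:
    with open(CACHE_FILE, "w") as f:
        json.dump(cache, f)

def geocode_once(address: str, cache: dict, geocoder) -> dict:
    """Geocode an address only if we haven't seen it before, so repeat
    renders don't spend credits (or seconds) on the same participant."""
    if address not in cache:
        cache[address] = geocoder(address)  # e.g., a call to a paid geocoding API
    return cache[address]

# Usage with a stand-in geocoder:
fake_geocoder = lambda addr: {"lat": 38.5, "lon": -121.7}
cache = load_cache()
coords = geocode_once("UC Davis, Davis CA", cache, fake_geocoder)
save_cache(cache)
```

With the cache persisted between renders, each daily update only pays for the handful of genuinely new registrants.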
A small but significant number of cases could not be handled by either of the above approaches. Either the registrant omitted multiple location fields, or my code didn't parse an international address correctly. These cases were flagged and saved in yet another CSV file. After each render, my script reported the number of unsuccessful location matches, telling me I needed to go look at the CSV file and, if possible, enter an ad-hoc geocoding string based on other information in the record. This worked in nearly all cases.
Generating the tabular summaries of registrants by domain type and organization was relatively straightforward using regular expressions to parse those fields. The ‘title' field was very eclectic, as you'd expect; nevertheless some interesting patterns started to emerge that I wanted to capture. Manually grouping titles into categories is normally what I'd do, but registration was happening too fast, so I needed a more automated solution: a word cloud. I had never generated word clouds in R before, but after a few hours of reading forum posts and playing with packages, I settled on the wordcloud2 package with text cleaning functions from the tm (text mining) package. A nice thing about wordcloud2 is that the output is interactive (hover over a word to see its frequency), and every time you refresh the page it redraws the word cloud with a different color palette. Thank you to the amazing community of open source programmers who over many years have developed word clouds, leaflet, and all the other amazing open source plugins for the web!
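The cleaning-and-counting step underneath any word cloud can be sketched in Python (the real version used R's tm package; the sample titles and stopword list below are made up for illustration):

```python
import re
from collections import Counter

# Hypothetical sample of the eclectic 'title' field
titles = [
    "GIS Analyst", "Drone Pilot / GIS Specialist",
    "Research Scientist", "GIS Coordinator", "UAS Program Manager",
]

STOPWORDS = {"of", "and", "the", "for"}

def word_counts(entries):
    """Roughly what tm's cleaning functions do: lowercase, strip
    punctuation, drop stopwords, then count word frequencies.
    The resulting counts are what a word cloud renders as sizes."""
    counts = Counter()
    for entry in entries:
        words = re.findall(r"[a-z]+", entry.lower())
        counts.update(w for w in words if w not in STOPWORDS)
    return counts

print(word_counts(titles).most_common(3))  # 'gis' tops this sample
```

Once you have the frequency table, the rendering library (wordcloud2 in our case) does the rest.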
Reflections and Lessons Learned
My homemade event technology solutions for DroneCamp using open-source software performed extremely well, with surprisingly few problems. In the process I learned a tremendous amount about integrating Google Apps, R, and Zoom that I will be able to draw upon for future work. Cracking the OAuth authentication dance and Google Apps sidebars were perhaps the biggest breakthroughs; together they give me a way to automate tasks involving any of the dozens of web API services out there, including Google Apps, Zoom, DropBox, Slack, and many, many others.
I'm very cognizant of the fact that I'm the beneficiary of an enormously well-developed ecosystem of open-source tools. The ability to import and munge data with just a few lines of code, and present it in compelling HTML based widgets like leaflet and word clouds, is simply remarkable. I and everyone who benefits from my work stand on the shoulders of the legions of open source developers who have toiled for years and years to build and share these remarkable tools. Next time you encounter a popup window inviting you to send a small donation to support an open-source platform or buy a coffee for a developer, please consider this!
There are also a few things I would do differently next time, or advice I would give those with similar requirements. Some of the main ones include:
The value of an LMS. Many of the tools I was forced to build from scratch are standard features of a Learning Management System (LMS). Perhaps the most valuable one would have been a one-stop-shop dashboard for participants. Most LMSs require users to create accounts, which by itself would have resolved a host of challenges regarding multiple emails, cross-checking registration status, distributing Zoom links to only those who need them, etc. LMSs also facilitate sharing course materials like slides and exercises, and give you more options to manage access.
Zoom User Accounts. If we were going to use Zoom again, I would definitely turn on user authentication. This simply means participants need individual Zoom accounts (free is fine), and use the Zoom app to join meetings and webinars. User authentication simplifies managing Zoom links, because upcoming Webinars and Meetings appear in the app. It also simplifies sharing recordings, and gives you better analytics. We initially discussed user authentication but shied away from it so as to not put further demands on participants. That was well-intentioned, but in hindsight there would have been more advantages than disadvantages for both participants and organizers.
Alternative Registration Systems. Linking Zoom meetings to a payment system doesn't just incur credit card fees; it also comes with workflow costs. Either you develop ad-hoc tools and protocols (as we did) to cross-check Zoom and event registration, or you adopt a 3rd party registration system like Eventbrite that integrates both payment and Zoom registration. All of these options come with tradeoffs that need to be incorporated into planning and support efforts.
Simpler, smaller, and/or free trainings can be conducted quite satisfactorily using just Zoom Meetings, but it helps to have a good understanding of Zoom's myriad settings. Zoom's built-in registration system is actually quite configurable, something we didn't take advantage of. You can add your own questions and branding elements to the registration form. For trainings that only involve one or two meetings, that might suffice.
- Author: Shane Feirer
Normally at this time of year, I am getting ready to travel to San Diego for the ESRI User Conference, where 20,000 people from all over the world gather to hear about the new GIS tools and functionality ESRI is building into its products. Participants also attend hundreds of technical sessions and workshops to improve their GIS skills, and they network and discuss how they are using GIS in their fields of interest.
This year, due to COVID-19, the in-person event is not occurring; instead the User Conference is being presented virtually (see agenda), with plenary sessions, technical sessions, and technical support. This is unfortunate, but it allows a broader community to attend the plenary and learn about how GIS is currently being used and what it can be used for.
The plenary session is always an eye-opener! It typically has examples of impactful ways in which GIS is changing the world - examples from education, environment, planning, health, and so much more! It is really a great experience, and I recommend it.
If you are curious about what GIS can do, or want ideas about how else GIS could be used in our organization, please register and attend the plenary sessions of the ESRI Virtual User Conference. If you want to talk about GIS or about what you saw at the Virtual User Conference, please email me at email@example.com.
‘Get access to the Plenary Session livestream
Watch powerful stories about how GIS is making a difference in the world. See demonstrations of Esri technology and learn about the newest upgrades. Hear a keynote from Jack Dangermond and presentations from other thought-provoking speakers.
Plenary Session access is complimentary for everyone'
Register now for the Plenary:
During this coronavirus lockdown, IGIS has set out to revamp its data infrastructure to address our growing needs for big data storage and management. In particular, over the past few years we have accumulated over a dozen terabytes of drone data and associated mapping products, constituting tens of thousands of project files, and the quantity of this data is only expected to keep growing. Typically this drone data has been processed on a number of local desktop computers and then backed up onto RAID hard drives or the cloud for cold storage; however, this is far from ideal in terms of consistent organization, versioning, and ease of distribution.
As a solution, IGIS purchased a web server equipped with multiple virtual machines (for processing, analysis, and web services) along with a 30TB RAID data store/repository. The repository was networked to our various IGIS computers and RAID storage devices so that all of our drone data could be transferred over to it. After much consideration, we settled on a standardized file structure that could accommodate datasets from both past and future drone projects, with room for growth as needed. A Python script automatically generates this file structure from a few metadata inputs for each project. Our previous projects' data were then moved into their appropriate slots in the new structure, jettisoning unwanted intermediary processing files and freeing up a ton of storage space. Migrating the data was, unsurprisingly, quite time consuming. Moving forward, however, it will be easy to set up each new project's file structure right from its inception: run the Python script in ArcGIS Pro's Jupyter Notebook utility in the field, and the data arrives at the server repository down the pipeline in a nicely organized package (similar to what we would provide to our non-IGIS project collaborators).
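A minimal Python sketch of such a project-structure generator (the folder names, naming convention, and metadata fields here are hypothetical stand-ins, not the actual IGIS template):

```python
from pathlib import Path

# Hypothetical standardized subfolder layout for a drone project
SUBFOLDERS = ["raw_images", "processed", "exports", "metadata", "docs"]

def make_project(root: str, project_name: str, flight_date: str, site: str) -> Path:
    """Create a standardized drone-project folder tree with a small
    metadata stub, so every project starts out organized the same way."""
    project = Path(root) / f"{flight_date}_{site}_{project_name}"
    for sub in SUBFOLDERS:
        (project / sub).mkdir(parents=True, exist_ok=True)
    (project / "metadata" / "README.txt").write_text(
        f"Project: {project_name}\nSite: {site}\nFlight date: {flight_date}\n"
    )
    return project

# Usage:
p = make_project("/tmp/drone_data", "veg_survey", "2020-06-15", "hrec")
```

A predictable, machine-generated layout like this is also what makes the downstream step possible: scripts can find every project's deliverables without any human guesswork.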
That alone is a big step in the right direction, but it gets better. Because all of this data is now in a standardized file structure, with standardized folder naming conventions, scripting our ArcGIS portal to automatically connect with the data via the image server was only a small step away. With this complete, any IGIS team member can now access our entire post-processed, GIS-ready inventory of drone data layers via ArcGIS Online or ArcGIS Pro.
Ultimately this has been a big leap forward for IGIS's informatics infrastructure, and it complements our significantly evolved pipeline for drone data collection and processing, depicted below.