- Author: Hannah Lopez
- Author: Andy Lyons
2020: DroneCamp Goes Virtual
In June 2020, IGIS hosted the 4th annual DroneCamp in collaboration with CSU Monterey Bay, the UC Unmanned Aircraft System Safety office, and Monterey Bay DART. The four-day intensive bootcamp aims to cover the full spectrum of skills and tools needed to collect data and make maps with drones.
Due to COVID-19, DroneCamp 2020 was held entirely online. We couldn't teach flight practice over Zoom, but otherwise the curriculum covered the usual suite of topics, including regulations and safety, equipment, flight planning, photogrammetry, and data analysis.
The online format wasn't nearly as fun as meeting in person at CSU Monterey Bay, but it produced some unexpected benefits. Over 330 people attended the program from all over the US, Canada, and several other countries. Attendees came from a wide range of backgrounds, including education, natural resource management, resource conservation, agriculture, real estate, construction, and archeology.
The pool of presenters and instructors also diversified both geographically and programmatically. Instructors came from several universities across California and beyond, and an expanded set of research talks covered a range of topics including workforce development, coastal and marine systems, and digital agriculture.
Session Recordings Now on YouTube!
You can access the sessions via the DroneCamp Online Program, or the DroneCamp 2020 Playlist on the IGIS YouTube Channel. The Online Program also contains links to presentation slides, exercise handouts, and datasets.
Organizing and editing all the videos for production was no small feat, and the DroneCamp team is excited to make these resource materials accessible to everyone who would like to learn more about the world of UAV technology. If you find them useful, or would like to get involved in future training on drone mapping, please leave a comment or send us a note.
IGIS would like to extend their gratitude to the many collaborators who made this year's online conference possible: the UC ANR Program Support Unit; the UC ANR Integrated Pest Management program; the UC ANR Learning & Development unit; Wilder Interpreting Services; Monterey Bay DART; the FAA Collegiate Training Initiative; and, of course, all our instructors, presenters, and participants. Thanks!
- Author: Andy Lyons
The Decision to Go Online. When COVID-19 struck, the DroneCamp Planning Committee had to decide quickly how to respond once it became apparent that an in-person event would not be possible. By the time we officially canceled the in-person event in early April, registration had already started. We had a couple of weeks to decide: do we cancel? Postpone? Hold it virtually?
Given the amount of planning that had already been invested (we started meeting in October 2019), and the fact that only one component (flight practice) would not translate at all to a virtual format, the Committee decided to move forward with an online event. We were admittedly intrigued by the possibilities of an online format, including a potentially larger audience. I was equally excited about the potential for an expanded pool of instructors, as we had been trying for years to get colleagues from distant campuses to participate. We were torn whether to maintain the same compressed time frame, or spread it out over several weeks as some virtual conferences were doing. We ultimately decided to keep the same one-week time frame, in part because instructors had already committed to those days. To mitigate Zoom fatigue, we aimed to limit programming to 2-3 hours each morning and 2 hours each afternoon. Sessions would be scheduled on the early side to accommodate potential participants on the east coast.
A Need for New Coordination Technology. After committing ourselves to an online event, the scale of what we needed started to hit me. To manage such an ambitious program, we needed a way to coordinate workshop info, software instructions, and exercise materials involving nearly two dozen instructors, and to share everything in a clear and organized manner with a couple hundred participants (which in the end grew to over 300). We also needed a way to manage registration for multiple Zoom sessions, including Webinars for the plenaries and concurrent Meetings for the hands-on software workshops.
Having been through this rodeo before, I knew two lessons for a successful conference. First, the key to a successful collaborative event is catering not just to the needs of participants, but equally if not more importantly to those of your presenters. Second, almost everyone, instructors and participants alike, waits until the 11th hour to pay attention. Hence whatever tool, or set of tools, we used had to be user-friendly and as automated as possible, so that last-minute changes could be accommodated with a minimum of manual steps.
Specifically, I needed to create three information tools:
- An Online Program, consisting of a mix of content from the Committee as well as instructors, that could be updated quickly and was capable of including a variety of links for slides, data, Zoom, etc.
- A way for people to sign-up for concurrent workshops, and get individualized Zoom links after cross-checking their registration status.
- A customized summary of participants, generated from registration data from the ANR survey platform.
Ideally, we would have had a Learning Management System to use, or at least an event-planning platform to build upon. That was not the case, so I had to turn to my go-to stack of DIY solutions: Google Apps Script, R, markdown, and GitHub.
The Online Program
An online event needs an online program. DroneCamp in particular needed a program that could be quickly updated before and during the event, was user-friendly for instructors to update their session info, and combined a range of both static and dynamic content including session titles and descriptions, thumbnail images, scheduling info, and links for slides, exercises, datasets, and Zoom.
To make life as easy as possible for busy instructors, I pre-populated as much of each session's info as I could, including draft session titles and descriptions. Knowing that session descriptions would need links to software installation instructions and other details, I created the equivalent of auto-text fields that, when rendered by my R script, were replaced with values stored on a different tab of the Google Sheet.
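The rendering script itself was written in R; the auto-text substitution idea can be sketched in JavaScript like this (the placeholder syntax, function, and field names are hypothetical, not the author's):

```javascript
// Hypothetical sketch of the "auto-text" substitution described above:
// tokens like {{install_link}} in a session description are replaced with
// values stored on a separate lookup tab of the Google Sheet.
function fillAutoText(description, lookup) {
  // Replace every {{key}} token with its value from the lookup table;
  // unknown keys are left as-is so they are easy to spot when proofreading.
  return description.replace(/\{\{(\w+)\}\}/g, (match, key) =>
    key in lookup ? lookup[key] : match
  );
}

// Example: this object stands in for the "values" tab of the sheet.
const lookup = { install_link: "https://example.com/install" };
const out = fillAutoText("Install the software first: {{install_link}}", lookup);
// out === "Install the software first: https://example.com/install"
```

With this pattern, a change on the lookup tab (say, a new download URL) propagates to every session description on the next render, which is exactly the kind of last-minute update the event required.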
Knowing that GitHub occasionally goes down, or could theoretically throttle our site if there was too much traffic, I created a backup site on an Ubuntu web server that IGIS manages. Fortunately we never needed to use it, but it was good to have a Plan B.
Page views are just the tip of the iceberg with Google Analytics. I don't have a huge amount of experience with it, but I hope to dig in some more next time to see what else I can glean about our users. Another area for future development is click tracking, which Google Analytics supports, to learn more about how people are using the site.
Creating a Zoom Registration Dashboard
Probably our biggest headache was managing the multiple Zoom sessions. Our requirements were more complex than your average webinar's, as we needed to:
- Manage a combination of Zoom Webinars (for plenary talks), and Zoom Meetings (for the hands-on software workshops). To complicate things further, we only had access to 1 webinar license (which we couldn't use the entire time because of other commitments), so had to ask colleagues to schedule webinars for us. We also needed up to 5 concurrent Zoom meetings for demos, meaning 5 different hosts.
- Make use of Zoom's registration features, to ensure only registered participants could join and to prevent Zoombombers.
- Provide a mechanism for participants to sign-up for concurrent workshops, and receive individualized Zoom links for the appropriate session after cross-checking their registration status
- Minimize additional registration steps participants had to complete, including registering for multiple Zoom meetings.
Biggest Challenge: Managing Zoom Meeting Registration. In the 3 months leading up to DroneCamp, we learned a lot about Zoom's dizzying array of dials and levers for meetings, webinars, user accounts, and the Zoom client. My biggest challenge was how to register 300+ people for roughly 8 concurrent Zoom meetings, preferably without asking them to click on individual Zoom registration links. This was a fairly big hurdle because, unlike Zoom webinars, the Zoom website does not provide a way to upload a CSV file of registrants for a Meeting. Additionally, I needed a solution that would cross-check our DroneCamp registration list and send individualized connection links. All of this had to be done as efficiently as possible, to cater to the dozens of participants I knew would not read any of our emails until after the opening session, and to accommodate people who would inevitably change their minds after the course started.
Strike 2: Third Party API Integration Services. Next, I turned to the Zoom API, beginning with the dozens of third-party integration services (e.g., automate.io, integromat, appypie.com, hull.io, tray.io). These platforms serve as the connective tissue for all manner of online services and web apps, helping you integrate your various web APIs (like Google Sheets and Zoom). They aren't free, but they're easy to set up and use, so if I had found one that got the job done it would have been a small price to pay. Reading through the details, however, I discovered they mostly supported Zoom webinars, not meetings, and few if any of them supported registration management. Another dead end.
But the authentication step is daunting. The simplest way to authenticate a custom Zoom App is with a JSON Web Token (JWT), which is the equivalent of sending a password. That option was not available to me, because our organizational ANR Zoom account is locked down for security reasons and I'm not an administrator. Thus I had to go with a user-authentication approach using the OAuth protocol, which requires programming a complex 'dance' between a login page, redirects, IP whitelists, and secret client keys and tokens under the hood. One would think there would be ample examples of authenticating Zoom's enormously popular API from Google's enormously popular Google Apps platform, but such is not the case. After a couple of late nights of stringing together various chunks of sample code, and playing whack-a-mole with cryptic error codes and configuration settings, I finally got it to work. I knew I could now create custom menus in Google Sheets to schedule new Zoom meetings, define their properties, and register people for them.
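As a rough illustration of step one of that OAuth dance, here is how the URL that sends a user to Zoom's consent page can be assembled (the client ID, redirect URI, and state value are placeholders; Zoom's documented authorize endpoint is https://zoom.us/oauth/authorize):

```javascript
// Illustrative sketch of the first leg of the OAuth 2.0 flow: building the
// authorization URL. The credentials below are placeholders, not real keys.
function buildAuthUrl(clientId, redirectUri, state) {
  const params = [
    "response_type=code",
    "client_id=" + encodeURIComponent(clientId),
    "redirect_uri=" + encodeURIComponent(redirectUri),
    "state=" + encodeURIComponent(state), // guards the redirect against CSRF
  ];
  return "https://zoom.us/oauth/authorize?" + params.join("&");
}

// After the user approves, Zoom redirects back with a one-time code, which
// is then exchanged for an access token via a POST to
// https://zoom.us/oauth/token (in Apps Script, typically with UrlFetchApp).
const url = buildAuthUrl("MY_CLIENT_ID", "https://script.google.com/callback", "xyz");
```

The open-source apps-script-oauth2 library handles much of this handshake for you, and is worth a look before rolling your own.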
The next step was to integrate this new capability into a Google Sheets app for DroneCamp. The Google Sheet serves as the 'hub' for several Google Forms, our registration data, and Zoom. It uses conditional formatting, a whole bunch of VLOOKUP and IMPORTRANGE functions, and a custom menu with a sidebar GUI to manage the Zoom functions. I called the result the Zoom Dashboard.
Each column of the Zoom Dashboard represents a Zoom meeting, and each row represents a participant. When a DroneCamp participant submits the Concurrent Session Sign-up form, a script runs that puts a '1' in the corresponding column of the Zoom Dashboard. The dashboard also holds a copy of the ANR Registration Survey results, and a combination of VLOOKUP functions and conditional formatting rules flags people who do not have a matching registration. As administrator, I could quickly check the registration status of people who requested a particular workshop, change the '1' to another code if needed, and then, when everything was good, run a menu command to actually register people for the meeting.
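The row/column bookkeeping described above can be sketched as follows (all names are hypothetical; in Apps Script the arrays would be read from and written back to the sheet with SpreadsheetApp range methods):

```javascript
// Sketch of the sign-up handler: the dashboard is a 2D array where each row
// is a participant and each column a Zoom meeting. A form submission puts a
// '1' in the matching cell; an unmatched email is flagged for manual review.
function recordSignup(dashboard, emails, meetings, email, meeting) {
  const row = emails.indexOf(email);      // participant row
  const col = meetings.indexOf(meeting);  // meeting column
  if (row === -1 || col === -1) return false; // no match: flag for review
  dashboard[row][col] = 1;
  return true;
}

// Example: two participants, two workshops (names illustrative).
const emails = ["a@example.com", "b@example.com"];
const meetings = ["Pix4D Demo", "ArcGIS Pro Demo"];
const dashboard = [[0, 0], [0, 0]];
recordSignup(dashboard, emails, meetings, "b@example.com", "Pix4D Demo");
// dashboard is now [[0, 0], [1, 0]]
```

The `false` return is the programmatic equivalent of the conditional-formatting flag: it tells the administrator that a sign-up didn't cross-check against the registration list.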
The key to creating an interactive Google Sheets app is using sidebars to present the user with a GUI. I had never programmed sidebars before, but had seen them used a lot, including in many Google Sheets add-ons. After another night of reading documentation and programming, I eventually figured it out. Behind the scenes, a sidebar is simply an HTML template with form elements that are configured to execute Google Apps Script functions. This gives you enormous flexibility in terms of both layout and functionality. Pretty cool.
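As a sketch, a sidebar like this might be assembled as follows (names are illustrative; in Apps Script the returned string would be passed to `HtmlService.createHtmlOutput()` and shown with `SpreadsheetApp.getUi().showSidebar()`):

```javascript
// Sketch of a sidebar GUI for the Zoom Dashboard. Behind the scenes a
// sidebar is just HTML; form elements call back into server-side Apps
// Script functions via google.script.run. All names here are illustrative.
function buildSidebarHtml(meetingNames) {
  const options = meetingNames
    .map((name) => "<option>" + name + "</option>")
    .join("");
  // The button's onclick handler would invoke something like
  // google.script.run.registerSelected() on the server side, e.g. to
  // register the validated participants for the selected Zoom meeting.
  return (
    "<form>" +
    "<select id='meeting'>" + options + "</select> " +
    "<input type='button' value='Register' onclick='registerSelected()' />" +
    "</form>"
  );
}
```

Because the sidebar is plain HTML, you can style it, add validation, and wire any control to any server-side function, which is what makes Sheets-based mini-apps like the Zoom Dashboard feasible.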
The system worked fairly well, and the vast majority of participants got their Zoom links quickly and in ample time. Last-minute changes in workshop preferences were dealt with relatively easily. It took less than a minute to review new workshop requests, validate them, and run the registration script. A small number of participants had recurring hiccups, due primarily to problems on their end, including: i) registering with multiple email addresses despite being told repeatedly not to do so, ii) making typos in their email addresses (shockingly common), or iii) failing to take the recommended steps to prevent Zoom confirmation emails from getting flagged as spam. Our phenomenal PSU staff were the front line dealing with questions about Zoom, but I intercepted many of these issues by first flagging problematic entries with conditional formatting rules on the Google Sheet, and then investigating them on a case-by-case basis.
Other features for the Zoom Dashboard that I ran out of time to implement included writing a script that would extract a participant's individual Zoom links for multiple meetings and send them in a single email. That would be extremely useful for an event like ours with so many sessions. I also had plans to write an auto-reminder email for Zoom meetings, another feature that Zoom only provides for Webinar licenses but can be done via the API.
Summary of Participants
We used the in-house ANR Survey System for registration and payment. Similar to platforms like Eventbrite, the ANR Survey Tool provides a highly customizable registration form linked to multiple payment options (including inter-campus transfers). Unlike Eventbrite, however, this older tool doesn't have a built-in report generator, nor an API providing access to the underlying data. Instead, you manage registrations through a web interface, which works well and includes a download option to get the results as an Excel file.
We had no idea how many people would sign up, so I wanted to create an HTML summary of registered participants for our instructors and planning team to stay abreast of registrations as they trickled in. I knew we'd get people from all over, so I wanted the summary to have an interactive map showing participants' locations, as well as various summaries of their organizations and titles. I also knew that I would have to update this summary on an almost daily basis, so I needed it to be generated as automatically as possible. There was no getting around having to manually download an Excel file, but my goal for everything after that was to do it "at the click of a button".
Geocoding. We didn't ask people on the registration form to enter their longitude-latitude coordinates, which meant I had to invoke a geocoding engine. However, I didn't want to geocode a participant more than once, both because it costs 'credits' and because it takes a couple of seconds per address. I also needed a way to 'help' the geocoding engine when a participant's location info was incomplete (e.g., the name of a university, but not the city or zip code).
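The geocode-each-address-only-once idea can be sketched like this (the author's implementation was in R and persisted results to a CSV file; here a plain object stands in for that cache, and the geocoder is a stand-in function argument):

```javascript
// Sketch of a caching wrapper around a geocoding service: results are
// cached by a normalized address string so repeat renders spend no credits
// and no time on addresses already seen.
function makeCachedGeocoder(geocode, cache = {}) {
  return function (address) {
    const key = address.trim().toLowerCase(); // normalize before lookup
    if (!(key in cache)) {
      cache[key] = geocode(address); // only hit the service on a cache miss
    }
    return cache[key];
  };
}

// Example with a fake geocoder that counts how often it is called.
let calls = 0;
const fakeGeocode = (addr) => { calls++; return { lat: 0, lon: 0 }; };
const cached = makeCachedGeocoder(fakeGeocode);
cached("Davis, CA");
cached("davis, ca "); // cache hit: same normalized key, no second call
```

The same wrapper is also a natural place to splice in the 'helper' strings for incomplete addresses: look up an ad-hoc override before falling through to the service.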
A small but significant number of cases could not be handled by either of the above approaches: either the registrant omitted multiple location fields, or my code didn't parse an international address correctly. These cases were flagged and saved in yet another CSV file. After each render, my script reported the number of unsuccessful location matches, telling me I needed to go look at the CSV file and, if possible, enter an ad-hoc geocoding string based on other information in the record. This worked in nearly all cases.
Generating the tabular summaries of registrants by domain type and organization was relatively straightforward, using regular expressions to parse those fields. The 'title' field was very eclectic, as you'd expect; nevertheless, some interesting patterns started to emerge that I wanted to capture. Manually sorting titles into categories is what I'd normally do, but registration was happening too fast, so I needed a more automated solution. I decided to use a word cloud. I had never generated word clouds in R before, but after a few hours of reading forum posts and playing with packages, I settled on the wordcloud2 package with text-cleaning functions from the tm (text mining) package. A nice thing about wordcloud2 is that the output is interactive (hover over a word to see its frequency), and every time you refresh the page it redraws the word cloud with a different color palette. Thank you to the amazing community of open-source programmers who over many years have developed wordcloud2, leaflet, and all the other amazing open-source plugins for the web!
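The author worked in R with tm and wordcloud2; the cleaning-and-counting step that feeds a word cloud can be sketched in JavaScript like this (the stopword list is an illustrative stub):

```javascript
// Sketch of the text-cleaning pipeline behind a word cloud: lowercase,
// strip punctuation, drop stopwords, then count word frequencies. A word
// cloud library sizes each word by its count.
function wordFrequencies(titles, stopwords) {
  const counts = {};
  for (const title of titles) {
    const words = title
      .toLowerCase()
      .replace(/[^a-z\s]/g, " ") // strip punctuation and digits
      .split(/\s+/)
      .filter((w) => w && !stopwords.has(w));
    for (const w of words) counts[w] = (counts[w] || 0) + 1;
  }
  return counts;
}

// Example with a stub stopword list and made-up registrant titles.
const stop = new Set(["of", "and", "the"]);
const freqs = wordFrequencies(
  ["GIS Analyst", "Professor of Geography", "GIS Specialist"],
  stop
);
// freqs.gis === 2
```

Running this on each fresh registration download gives an up-to-date frequency table without any manual categorization, which was the whole point.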
Reflections and Lessons Learned
My homemade event-technology solutions for DroneCamp, built with open-source software, performed extremely well, with surprisingly few problems. In the process I learned a tremendous amount about integrating Google Apps, R, and Zoom that I will be able to draw upon for future work. Cracking the OAuth authentication dance and Google Apps sidebars were perhaps the biggest breakthroughs; together they give me a way to automate tasks involving any of the dozens of web API services out there, including Google Apps, Zoom, DropBox, Slack, and many, many others.
I'm very cognizant of the fact that I'm the beneficiary of an enormously well-developed ecosystem of open-source tools. The ability to import and munge data with just a few lines of code, and present it in compelling HTML based widgets like leaflet and word clouds, is simply remarkable. I and everyone who benefits from my work stand on the shoulders of the legions of open source developers who have toiled for years and years to build and share these remarkable tools. Next time you encounter a popup window inviting you to send a small donation to support an open-source platform or buy a coffee for a developer, please consider this!
There are also a few things I would do differently next time, or advice I would give those with similar requirements. Some of the main ones include:
The value of an LMS. Many of the tools I was forced to build from scratch are features you would find in a Learning Management System (LMS). Perhaps the most valuable missing feature was a one-stop-shop dashboard for participants. Most LMSs require users to create accounts, which by itself would have resolved a host of challenges regarding multiple emails, cross-checking registration status, distributing Zoom links to only those who need them, etc. LMSs also facilitate sharing course materials like slides and exercises, and give you more options to manage access.
Zoom User Accounts. If we were going to use Zoom again, I would definitely turn on user authentication. This simply means participants need individual Zoom accounts (free is fine) and use the Zoom app to join meetings and webinars. User authentication simplifies managing Zoom links, because upcoming Webinars and Meetings appear in the app. It also simplifies sharing recordings and gives you better analytics. We initially discussed user authentication but shied away from it so as not to put further demands on participants. That was well-intentioned, but in hindsight there would have been more advantages than disadvantages for both participants and organizers.
Alternative Registration Systems. Linking Zoom meetings to a payment system doesn't just incur credit card fees; it also comes with workflow costs. Either you develop ad-hoc tools and protocols (as we did) to cross-check Zoom and event registration, or you adopt a third-party registration system like Eventbrite that integrates both payment and Zoom registration. All of these options come with tradeoffs that need to be incorporated into planning and support efforts.
Simpler, smaller, and/or free trainings can be conducted quite satisfactorily using just Zoom Meetings, but it helps to have a good understanding of Zoom's myriad of settings. Zoom's built-in registration system is actually quite configurable, something we didn't take advantage of. You can add your own questions and branding elements to the registration form. For trainings that only involve one or two meetings, that might suffice.
- Author: Shane Feirer
Normally at this time of year, I am getting ready to travel to San Diego for the ESRI User Conference, where 20,000 people from all over the world gather to hear about the new GIS tools and functionality ESRI is building into their products. Participants also attend hundreds of technical sessions and workshops to improve their GIS skills, and they network and discuss how they are using GIS in their fields of interest.
This year, with COVID-19, the in-person event is not occurring; instead, the User Conference will be presented as a virtual conference (see agenda) with plenary sessions, technical sessions, and technical support. This is unfortunate, but it allows a broader community to attend the plenary and learn about how GIS is currently being used and what it can be used for.
The plenary session is always an eye-opener! It typically has examples of impactful ways in which GIS is changing the world - examples from education, environment, planning, health, and so much more! It is really a great experience, and I recommend it.
If you are curious about what GIS can do, or want ideas about how else GIS could be used in our organization, please register and attend the plenary sessions of the ESRI Virtual User Conference. If you want to talk about GIS, or about what you saw at the Virtual User Conference, please email me at email@example.com.
"Get access to the Plenary Session livestream. Watch powerful stories about how GIS is making a difference in the world. See demonstrations of Esri technology and learn about the newest upgrades. Hear a keynote from Jack Dangermond and presentations from other thought-provoking speakers. Plenary Session access is complimentary for everyone."
Register Now for the Plenary:
The Problem: An Outdated and Unwieldy Site
The Informatics and GIS (IGIS) Statewide Program was established in 2012 to meet UCANR's growing need for geospatial research and technical training. We created our website using ANR's Site Builder content management system, which was state of the art at the time. Like many programs, our website grew organically over the years, until we woke up one day and discovered we had over 70 subpages!
The size of our site made it not only unwieldy to navigate, but also difficult to maintain and keep relevant. You know the drill: multiple subpages that you forgot about; no one updating them; a general sprawl of information that only a few knew how to navigate. With feedback from UCANR Strategic Communications, we came to realize that our site was highly focused on our own needs and those of an internal (UCANR) audience. Clearly, we needed to redesign the website with a more external focus, and make it easier to navigate and find information. So we did a collective "Sprint" during the COVID-19 shutdown and revamped the site.
Goals for the New Site
Rather than merely do a superficial makeover, we decided to reorganize our site from the ground-up. We started out thinking about our communication goals, and the needs and interests of our clientele. Some people come looking for a specific resource, such as a software license or Tech Note. Others come looking for info about our upcoming training programs, or to learn more about our GIS and Drone Services. Still others just want to see what we do, and how we fit into ANR's overarching program umbrella.
Our new site is organized around four themes: Research, Services, Training, and Resources. To convey the breadth of what we do, we decided to develop “cards”, or nuggets of information about our work. A common thread in everything we do is connecting-the-dots, so we decided to make heavy use of tags and hyperlinks that connect our work to ANR's Strategic Goals, Public Value Statements, and our collaborators around the state.
Solution: Customization via Site Builder
Results: Dynamic and Focused Content
Custom Section Dividers
To help visitors navigate the home page and find the content they're looking for quickly, we developed attractive page dividers that split the landing page into sections. Under the hood these are simple DIV tags with custom background and color attributes, and are super-easy to make.
Another visual enhancement that makes a web page look less generic is using custom icons as buttons. We modified some standard clipart images to use as links on the 'Client Services' toolbar, which take users to different parts of our site. For a little branding flourish, we used colors that match the ANR palette, and created an 'inverse' version of each image that appears when you hover over it. This is all very standard HTML and easy to implement.
Integration with Google Sheets
Over the past couple of years, we've migrated the bulk of our program planning and tracking into Google Sheets. Nearly everything we do - workshop offerings, drone flights, service projects, publications, surveys - is recorded in a collection of easy-to-update Google Sheets. For this project, we created a new sheet to hold content specifically for certain parts of our website, including the project cards, metrics about our training and GIS services, and even quotes from our clientele.
Keeping Things Fresh through Randomization
To avoid stale content, we built in some randomization. Every time you come back to the home page, or refresh it, the video loop, project cards, and other content changes. As we add more content on the backend, the selection will be even more varied, making every visit seem new and fresh.
We've been looking for a way to show off some of our drone video for a long time, but the standard embedded YouTube player just wasn't cutting it. After creating a custom space for a video loop on the homepage, we used a command line utility called ffmpeg to clip, resize, fade, and encode some of our favorite drone video clips at a low bandwidth (see if you can guess which RECs they come from!). Adding the videos to the page was super easy using the standard HTML5 video tag, and getting them to auto-play and loop simply involved a couple of extra attributes. The video files live on our server, and the file names are randomized, so every time you refresh the page a new video starts playing.
Encoding video for webpages is standard practice these days, so to make life easier for others (and ourselves!) we wrote up this workflow in a new Tech Note entitled Encoding Drone Video for the Web. In it, you'll find a link to a Google Sheet that reduces the pain of using ffmpeg by generating the command line for you. Simply substitute your own video file name, start & stop times, and crop parameters, and the command is generated for you, ready to copy-paste into a command window.
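A command generator of that kind boils down to simple string assembly, sketched below (the flags `-ss`, `-to`, `-i`, `-vf`, `-an`, and `-b:v` are standard ffmpeg options; the parameter values and function name are illustrative, not the ones IGIS used):

```javascript
// Sketch of an ffmpeg command generator: given a clip's file names,
// start/stop times, and crop dimensions, assemble a command line that
// clips, crops, resizes, strips audio, and targets a low bitrate.
function ffmpegCommand(input, output, start, stop, cropW, cropH) {
  return [
    "ffmpeg",
    "-i", input,
    "-ss", start,        // clip start time
    "-to", stop,         // clip stop time
    "-vf", "crop=" + cropW + ":" + cropH + ",scale=1280:-2", // crop, then resize
    "-an",               // drop the audio track (autoplay-friendly)
    "-b:v", "1M",        // low-bandwidth target for web streaming
    output,
  ].join(" ");
}

const cmd = ffmpegCommand("flight.mp4", "loop.mp4", "00:00:05", "00:00:20", 1920, 810);
// cmd begins with "ffmpeg -i flight.mp4 -ss 00:00:05 -to 00:00:20"
```

A spreadsheet formula can do exactly the same concatenation, which is presumably why a Google Sheet makes a convenient front end for non-programmers.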
Modularizing Content for Flexible Placement
Good site design starts with thinking about your visitors, your content, and your communication goals. Transitioning to the new Site Builder template was a good excuse for us to jettison eight years and 70 pages of accumulated web content, and think about what really matters. We really like the new Site Builder template, and managed to get the look and feel we were hoping for with a modest amount of HTML customization and Google Sheets integration. Web pages are never ‘done', but future development will be a lot easier now that we have the building blocks in place. And we'd love feedback! Check out our new site if you're interested, and contact us if you have thoughts or would like to learn more.
Interested in knowing how people are using their geospatial skills in the era of COVID-19?
Last week, Harvard hosted a 10-panelist webinar (Center for Geographic Analysis Virtual Forum: Responding to the COVID-19 Pandemic with Geospatial Research and Applications) in which experts explored real-time datasets, displayed transmission models, and discussed data ethics of the current pandemic.
Innovative uses of datasets (such as mobile phone data and social media Tweets) exemplified how we can instantaneously map the spread of this disease at very high temporal and spatial scales. Yet, as many geospatial fanatics and drone pilots know, with high resolution comes high risk (of privacy concerns, in this case).
Data privacy emerged as a recurring theme throughout this webinar. Several panelists discussed confidentiality issues of pandemic mapping, which involve the spatial resolution at which data are analyzed – at the individual level or aggregated to a larger and more anonymized level: a scale at which houses, faces, and identities cannot be recognized in detail.
Dr. Caroline Buckee (Harvard), who has been building a research network in collaboration with data companies to aggregate, anonymize, and analyze COVID-19 cellular phone data, explained why the data she analyzes are disseminated at the county level, instead of at the house or neighborhood level. One reason for this is that we would not want punitive action to take place against specific neighborhoods or households if they are not following mandates such as the shelter-in-place policy, because we don't know if these individuals are attending work or are performing essential tasks. It is important that the data we share does not result in discrimination.
Dr. Doug Richardson (Harvard) echoed these sentiments and provided information on a platform he is developing to promote data security and confidentiality: Geospatial Virtual Data Enclave (GVDE). Ethical and security standards embedded into this portal can help ameliorate issues of data confidentiality.
Another theme that was brought to light in this webinar was how narratives can become lost in COVID-19 geospatial data. Dr. Mei-Po Kwan (Chinese University of Hong Kong) mentioned how every dot on the map has a story – and these stories are steeped in inequity and inequality. Dr. Buckee reminded us that each transmission has important geographic context, and considering different risk factors (such as age and socioeconomic status) and covariates (such as population density) are key to interpreting these data. Finally, Dr. Este Geraghty (Esri) introduced several resources that Esri provides to incorporate and honor the stories of those who have battled against the virus (https://coronavirus-resources.esri.com/pages/resources).
Below are notes on some of the methodologies and resources discussed in the webinar.
- Twitter COVID-19 Hot Spots (April 7-14, 2020), created using the Twitter geo-search API (spatial query, no semantic query) and the Getis-Ord Gi* function (hot spot analysis) on the attribute #relevantTweets/#allTweets per cell. The methodology (semantic machine learning) is described in this publication: https://www.tandfonline.com/doi/full/10.1080/15230406.2017.1356242.
- A location/allocation model in Esri's ArcGIS Pro for finding optimal testing sites, treatment sites, and food distribution sites during the pandemic in San Bernardino County. This model determines population demand by creating a risk surface (including transmission, personal susceptibility, exposure, and socioeconomic factors), and then calculates optimal locations (layers: road network data, risk surfaces, and supply chain constraints for staffing facilities and administering tests).
- Esri COVID-19 resources: https://coronavirus-resources.esri.com/pages/resources
- ArcGIS implementation of the University of Pennsylvania's COVID-19 Hospital Impact Model for Epidemics (CHIME): https://www.arcgis.com/home/item.html?id=37ad6eb0d1034cd58844314a9b305de2
- COVID-19 Spatiotemporal Rapid Response Gateway: https://covid-19.stcenter.net/
- COVID-19-related big-data analytics demo by Todd Mostak of OmniSci Technologies portraying interactive exploration of 16 billion rows of location data from mobile phones, from their cell phone partner X-Mode: https://youtu.be/Oeg3jF5xs6o?t=147
In summary, this was a very informative seminar that acknowledged several critical topics in geospatial data analysis, highlighting strengths of data sources and methodologies, along with concerns and shortcomings in the current state of pandemic mapping.