- Author: Sean Hogan
- Author: Maggi Kelly
July 2021 marks the fifth anniversary of DroneCamp, and thanks to an all-star lineup of presenters and instructors and a fantastic, diverse group of over 255 attendees from all over the world, it was a massive success this year. Initially launched in 2016 by the UC ANR IGIS Statewide Program, DroneCamp has evolved into a multi-campus and industry collaboration, with a network of drone experts hailing from UC ANR, UC Santa Cruz, UC Berkeley, UC Merced, UC Davis, CSU Monterey Bay, and the Monterey Bay Drones, Automation and Robotics Technology (DART) program. From 2016 to 2019 the event was held in person at Davis, San Diego, and then Monterey, CA; in 2020, however, it moved online due to COVID-19. The move indoors, to discuss a very outdoor-oriented topic, came with some challenges. We had to shelve our important (and fun) hands-on equipment and flight training, for example. Yet it had rewards: we were able to reach a larger and more diverse group of participants, and to widen the scope of the content. Ultimately, 2020 was a great success, but in the process we recognized that it could be even better with additional help from our network of drone expert friends from around the state.
Like last year, we came together online for DroneCamp 2021 in July. Over 255 people joined from around the world to learn about drone theory, applications, regulations, and data processing. We learned about the practical aspects of flying safely, took deep dives into various software workflows, and explored agricultural, forestry, and vegetation mapping examples. This year we hosted five electrifying plenary sessions from cutting-edge scientists around California: crashing drones! precision agriculture! citizen science! mapping aquatic environments!
Because the overall objective of DroneCamp remains to provide the most practical and comprehensive learning experience possible for attendees, we are planning an in-person, hands-on training day for October, to be held in Monterey/Marina, California. And as the threat of COVID-19 subsides, additional in-person training sessions will be added around the state in the coming year.
DroneCamp is designed for a wide range of skill levels and interests, for anyone interested in using drones for non-recreational purposes (which requires a Part 107 remote pilot certificate), from complete beginners with little to no experience in drone technology to intermediate users who want to learn more advanced data processing and analysis. Between presentations on contemporary applications of drones in environmental and agricultural research and hands-on data processing and analysis exercises, attendees have the opportunity to fully immerse themselves in the following topics:
Flight Skills: Safe Launch and Landings, Basic Operations, Traversing and Avoiding Obstacles, Night Flying
Safety and Regulations: Safety and Regulations Overview; How to be a Good Visual Observer; Operating in Controlled Airspace
Hardware: Sensors, Platforms and Field Accessories
Data Collection: Mission Planning; High Precision Mapping
Data Processing: Stitching Drone Images with Pix4D, Agisoft Metashape, ArcGIS Pro, and OpenDroneMap; Analyzing Processed Drone Data in QGIS and ArcGIS Pro
Data Analysis and Management: Vegetation Analysis and Classification in ArcGIS Pro; Analysis of the Intertidal Zone; A case study of data management, from collection to storage and sharing of data outputs
It was super fun and rewarding, and a great success for ANR and all the other collaborators. We built networks, increased collaboration, learned some very cool technical stuff, and got updated on current regulations, including the fact that you can now renew your Part 107 license easily here.
Some inspiring quotes from anonymous attendee reviews:
I was so appreciative of the extremely high caliber faculty/instructors that were recruited to give presentations, demos, and use of software. You all worked so well together to impart different pieces of expert knowledge. You all are brilliant and I'm inspired!!!
Loved it. We started off heavy which blew my mind but all the talks were so informative and fascinating. Really appreciate the diverse group of people you gathered together. Just wish it was in person! I would love to meet everyone.
Excellent presentation that allowed those of us with ArcMap experience to see the similarities and differences offered by ArcGis Pro. Again, a wonderful presentation (by a professional) that accounts for all the practical steps involved with data manipulation rolled into a final product.
Once I read through my notes, look at my screen captures and watch some of the presentations again, I will be able to structure my drone classes for my students. I'm developing a drone program from scratch for middle and high school students at a local charter school.
Consider signing up for the in-person training in October, and keep DroneCamp 2022 in your sights! Further information will be coming soon to the DroneCamp website, https://dronecampca.org/
The UC Berkeley Rausser College of Natural Resources Sponsored Project for Undergraduate Research (SPUR) project “Mapping municipal funding for police in California,” launched in Fall 2020, continued in Spring 2021. This semester we continued our work with Mapping Black California (MBC), the Southern California-based collective that incorporates technology, data, geography, and place-based study to better understand and connect African American communities in California. Ben Satzman, the project lead in the Fall, was joined by Rezahn Abraha. Together they dug into the data, found additional datasets that helped us understand changes in police funding in California from 2014 to 2019, and explored the variability of police spending across the state. Read more below, and here is the Spring 2021 Story Map: How Do California Cities Spend Money on Policing? Mapping the variability of police spending from 2014-2019 in 476 California Cities.
This semester we again met weekly and used data from 476 cities across California detailing municipal police funding in 2014 and 2019. By way of background, California has nearly 500 incorporated cities; most municipalities have their own police departments and create an annual budget determining what percentage their police department will receive. The variability in police spending across the state is quite surprising. In 2019 the average percentage of municipal budgets spent on policing was about 20%, and while some municipalities spent less than 5% of their budgets on policing, others allocated more than half of their budgets to their police departments. Per capita police spending averaged about $500, but varied widely, from about $10 to well over $2,000.
We set out to see how police department spending changed from 2014 to 2019, especially in relation to population changes from that same 5-year interval. We used the California State Controller's Finance Data to find each city's total expenditures and police department expenditures from 2014 and 2019. This dataset also had information about each city's total population for these given years. We also used a feature class provided by CalTrans that had city boundary GIS data for all incorporated municipalities in California.
By dividing police department expenditures by total city expenditures for both 2014 and 2019, we were able to create a map showing what percentage of their municipal budgets 476 California cities were spending on policing. We were also able to visualize the change in the share of budgets going to police departments, and the change in population, from 2014 to 2019. Changes in police spending (and population) were not at all consistent across the state. For example, cities that grew sometimes increased spending, but sometimes did not. Ben and Rezahn came up with a useful way of visualizing how police spending and population change co-vary (click on the map above to go to the site), and found 4 distinct trends in the cities examined:
Cities that increased police department (PD) spending, but saw almost no change in population (these are colored bright blue in the map);
Cities that saw increases in population, but experienced little or negative change in PD spending (these are bright orange in the map);
Cities that saw increases in both PD spending and population (these are dark brown in the map); and
Cities that saw little or negative change in both PD spending and population (these are cream in the map).
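The percentage and trend calculations above are straightforward to reproduce. Here is a minimal sketch in Python with pandas, using made-up numbers for two hypothetical cities; the column names and the 2-point "little change" threshold are illustrative assumptions, not the actual State Controller schema or the thresholds used in the project:

```python
import pandas as pd

# Hypothetical example data: total and police department (PD) expenditures
# and population for two made-up cities (illustrative values only)
df = pd.DataFrame({
    "city":           ["City A", "City B"],
    "total_exp_2014": [100_000_000, 50_000_000],
    "pd_exp_2014":    [20_000_000, 10_000_000],
    "total_exp_2019": [120_000_000, 55_000_000],
    "pd_exp_2019":    [30_000_000, 10_500_000],
    "pop_2014":       [100_000, 40_000],
    "pop_2019":       [101_000, 52_000],
})

# Percent of the municipal budget spent on policing, each year
df["pd_pct_2014"] = df["pd_exp_2014"] / df["total_exp_2014"] * 100
df["pd_pct_2019"] = df["pd_exp_2019"] / df["total_exp_2019"] * 100

# 5-year change in the policing share (percentage points) and in population (%)
df["pd_pct_change"] = df["pd_pct_2019"] - df["pd_pct_2014"]
df["pop_change_pct"] = (df["pop_2019"] - df["pop_2014"]) / df["pop_2014"] * 100

# Assign each city to one of the four trend groups, using an arbitrary
# threshold to define "little change"
THRESH = 2.0  # illustrative only

def trend(row):
    pd_up = row["pd_pct_change"] > THRESH
    pop_up = row["pop_change_pct"] > THRESH
    if pd_up and not pop_up:
        return "PD spending up, population flat"
    if pop_up and not pd_up:
        return "population up, PD spending flat/down"
    if pd_up and pop_up:
        return "both up"
    return "both flat/down"

df["trend"] = df.apply(trend, axis=1)
```

In this toy example, City A's policing share rises from 20% to 25% while its population barely changes, placing it in the first group, while City B's population grows 30% with a flat policing share, placing it in the second.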
They then dug into southern California and the Bay Area, and selected mid-size cities exemplifying the four trends to tell more detailed stories. For the Bay Area these were Vallejo (increased PD spending, almost no change in population), San Ramon (population growth, little or negative change in PD spending), San Francisco (increases in both), and South San Francisco (little or negative change in both); for southern California they were Inglewood, Irvine, Palm Desert, and Simi Valley, illustrating the same four trends respectively. Check out the full Story Map here, and read more about these individual cities.
The 5-year changes in municipal police department spending are challenging to predict. Cities with high population growth from 2014 to 2019 did not consistently increase the share of their budgets spent on policing. Similarly, cities that experienced low or even negative population growth varied dramatically in how their police department spending changed. The maps of annual police department spending percentages and 5-year relationships allowed us to identify these complexities, and will be an important source of future exploration.
The analysts on the project were Rezahn Abraha, a UC Berkeley Conservation and Resource Studies Major, and Ben Satzman, a UC Berkeley Conservation and Resource Studies Major with minors in Sustainable Environmental Design and GIS. Both worked in collaboration with MBC and the Kellylab to find, clean, visualize, and analyze statewide data. Personnel involved in the project are: from Mapping Black California - Candice Mays (Partnership Lead), Paulette Brown-Hinds (Director), Stephanie Williams (Exec Editor, Content Lead), and Chuck Bibbs (Maps and Data Lead); from the Kellylab: Maggi Kelly (Professor and CE Specialist), Chippie Kislik (Graduate Student), Christine Wilkinson (Graduate Student), and Annie Taylor (Graduate Student).
We thank the Rausser College of Natural Resources who funded this effort.
Fall 2020 Story Map: Mapping Police Spending in California Cities. Examine Southern California and the Bay Area in detail, check out a few interesting cities, or search for a city and click on it to see just how much they spent on policing in 2017.
Spring 2021 Story Map: How Do California Cities Spend Money on Policing? Mapping the variability of police spending from 2014-2019 in 476 California Cities.
Like many things that start small and then go viral, the growth of DroneCamp has been welcome but challenging to sustain. The situation just got a whole lot better thanks to a grant from the UC Vice President for Research and Innovation.
DroneCamp was developed and launched in 2017 by the Informatics and GIS program in UC ANR. From the very beginning, the multi-day, bootcamp-style short course was designed to be highly collaborative and dynamic, with instructors coming from UC ANR, the UC Unmanned Aircraft System Safety Center, and drone photogrammetry companies including Pix4D and ESRI. This was not only strategic for building long-term working relationships among campuses and the private sector, but also necessary. The expertise needed to use drones for mapping and data collection is scattered across departments and campuses, and no one unit has the resources to cover all the bases.
Over the next few years, the program grew by leaps and bounds as we went "on the road" to UC San Diego in 2018 and CSU Monterey Bay in 2019. Each year the class size, curriculum, and program complexity grew. In San Diego, the program was hosted by the UCSD Environmental Health and Safety office, with additional sessions by the UCSD Drone Lab, one of the most advanced drone labs in the UC system. DroneCamp 2019 represented a step increase in collaboration, thanks in large part to the efforts of UC ANR Vice President Glenda Humiston, who connected us with Monterey Bay DART (Drones, Automation and Robotics Technology), a technology-focused economic development program based out of the nascent UC Monterey Bay Education Science & Technology Center (UCMBEST). The Institute for Innovation & Economic Development at CSU Monterey Bay, under Dean Andrew Lawson, hosted the event, opening the doors to a new academic building and an extremely modern computer lab. Flight training was held at the nearby UC NRS Fort Ord Natural Reserve, managed by UC Santa Cruz. The whole program was planned to coincide with an industry symposium implemented by DART, so that the day after the instructional sessions concluded, we shifted gears to an invigorating all-day symposium on technology, policy, and regional economic development.
The scope of DroneCamp took another great leap forward in 2020, when the virtual format blew the lid off the scale and breadth of instructors, workshop topics, and participants. The talented pool of instructors in 2020 (most of whom are returning for DroneCamp 2021), hailed from across California and as far away as Ohio, including:
- UC ANR
- CSU Monterey Bay
- UC Merced
- UC Santa Cruz
- UC Davis
- UC Berkeley
- UC NRS
- Palomar College
- Open Drone Map consortium
The virtual format allowed the Planning Committee not only to expand the core instructional content covering the full workflow of drone mapping, but also to add panels on specific research applications, including coastal environments and agriculture, as well as industry perspectives and technology updates. To manage the load of what was now a medium-sized instructional conference, IGIS developed a custom content management system, including an automated registration dashboard to handle the many concurrent Zoom sessions. The beneficiaries of this growth included not only the over 330 participants from 5 countries, but also the expanding network of instructors, researchers, and industry partners.
Growth, however, comes with costs, and although the success has been rewarding, the complexity of the program has become increasingly unwieldy for the ad hoc Planning Committee to strategically plan and coordinate. That is why we're thrilled to share that a recent research innovation grant to UC Santa Cruz from UC Vice President for Research and Innovation Theresa Maldonado will support UCSC's growing drone program, including coordination support for DroneCamp. The new funding will allow Dr. Justin Cummings, a UAS operator and data analyst with the California Heartbeat Initiative, to spend more time coordinating the many moving parts of DroneCamp, and will support the Planning Committee in continuing to deliver what has evolved over the last five years into one of the premier drone mapping training events in the country.
The support and stability will also allow the Planning Committee to make inroads on some long-standing programming goals. These include developing a stronger needs assessment and evaluation framework, connecting with workforce development initiatives and continuing education programs, expanding fundraising and sponsorships, reviving the scholarship program for under-represented groups, organizing panels that bring together scientists, industry leaders, and funders, and packaging instructional materials into extension products that can be used in other settings as well as asynchronous learning.
The future of DroneCamp is bright, and 2021 should be another information-packed week of drone technology and data analysis. So get your propellers on, clear some space on your hard drive, and plan to come to DroneCamp July 26-30, 2021.
DroneCamp Planning Committee:
- Sean Hogan, UC ANR IGIS Program
- Justin Cummings, UC NRS, UCSC
- Brad Barbeau, CSU Monterey Bay
- Chris Bley, InsightUp Solutions
- Becca Fenwick, UC NRS, UCSC
- Corey Garza, CSU Monterey Bay
- Pat Iampietro, CSU Monterey Bay
- Maggi Kelly, UC ANR IGIS Program
- Andy Lyons, UC ANR IGIS Program
- Michael Matkin, UC Santa Cruz
- Josh Metz, DART
- Mallika Nocco, UC ANR
- Brandon Stark, UC Merced
- Author: Andy Lyons
For the past several years, IGIS has been offering online Office Hours for the ANR community on a range of topics related to geospatial science and technology. In 2020, we increased our Office Hours appointments to 4 hours a week. Today the IGIS team is pleased to announce the addition of three new topic areas to our lineup of support for 2021:
Google Earth Engine (January-May 2021 only). Earth Engine is an online platform from Google that provides access to a vast array of imagery and a powerful scripting language for analysis and visualization. Annie Taylor, who taught two extremely popular Earth Engine workshops in 2020, is an Earth Engine whiz and will help you get over any Earth Engine bumps you may be facing.
Cal-Adapt. Cal-Adapt provides access to climate data and visualization tools from California's 4th Climate Change assessment. We work closely with the Cal-Adapt team on a couple of projects and love to talk about climate data. Need to work with Cal-Adapt data in a programming environment? IGIS's Shane Feirer and Andy Lyons have developed add-ons to work with Cal-Adapt data in Python, ArcGIS Pro, and R.
Advanced Google Sheets and Google Apps Script. Underlying Google Sheets, Forms, Docs, and other GSuite apps is a powerful scripting language called Apps Script. We use these tools a lot (in fact the sign-up system for Office Hours is driven by Apps Script), so if you want to do something out-of-the-box with Google Sheets, automate your work, or develop integrated workflows across applications, we might be able to help.
These new topics are in addition to our usual portfolio of support subjects, including:
GIS - desktop software, web GIS, mobile data collection, automation
drones - safety and regulations, equipment, mission planning, data processing and analysis, Pix4D, data management
spatial data and analysis - going from questions to workflows, finding data, spatial analysis
R - programming, working with spatial data in R, statistical analysis, Shiny apps
remote sensing - finding data, image processing and analysis, ENVI
databases - designing data structures, MS Access, SQL
We love Office Hours because it gives us a chance to meet new people, share what we know, and keep us sharp. So please don't be shy when you have a question or don't know where to start - sign up for Office Hours!
- Author: Andy Lyons
- Contributor: Sean Hogan
- Contributor: Shane Feirer
- Contributor: Maggi Kelly
The 2020 COVID-19 pandemic changed the way educators work across the board. Like many groups, we had to quickly retool our geospatial workshops from a largely in-person format to online delivery. We quickly learned that simply replacing the overhead projector with a webcam and Zoom was not going to cut it. It's been an interesting journey, with lots of lessons learned, some surprising benefits, and a number of ongoing challenges and frustrations.
In this two-part post, I reflect upon lessons and aspirations for a very specific type of online training: live workshops focused on showing people how to use GIS, photogrammetry, and statistical software. These reflections stem from several workshops I was part of in 2020, including our first ever virtual DroneCamp, a new workshop focused on using climate data for community resilience and adaptation planning, a workshop on working with climate data in R, and two 12-hour workshops on geospatial analysis with R I taught in collaboration with SCGIS and BayGeo. Some of these lessons are specific to this new online technology, while others go back to age-old best practices for effective teaching.
In Part I (this post), I review some of the lessons, techniques, and conversations about training we had in 2020. In Part II, I discuss some aspirations and directions for 2021. But first let's review the particular requirements and challenges for this specific type of online training.
The Curious Circumstances of Live Online Software Workshops
Several characteristics of live virtual workshops frame the challenges we face:
- One-time engagement. Unlike a course that meets repeatedly over many weeks, we generally have a single point of contact with workshop participants. Even when a series of workshops are thematically connected, they are almost always offered as standalone modules, so we can hope but not expect that people have attended previous workshops.
- Diverse backgrounds. Also unlike a traditional classroom, where you can mostly assume students have minimum prerequisites, workshop participants generally have a wider range of backgrounds, both in their familiarity with the content (e.g., GIS concepts) and with the specific software tools being used. Particularly with 'Intro' workshops (which we do a lot of), even basic computer skills like unzipping a zip file cannot be taken for granted.
- Hands-on emphasis. The primary goals of our workshops are to impart practical skills, so the format requires a strong hands-on software component. There's a huge demand for hands-on software training, and many, many benefits for the learner, but incorporating hands-on practice also adds a lot of moving parts.
- People connecting from home. Even before COVID, most participants in virtual workshops were joining from home. This generally means people are connecting from a laptop of unknown specifications, using a slow to moderate internet connection. Computer hardware, workspace distractions, and the need to multi-task are generally all over the map when people are connecting from home, and there's not a lot we can do to influence these factors.
- Voluntary, self-motivated participants. The people who attend our workshops are there because they want to be. In many cases they're actually paying something out of pocket. Self-motivated students are delightful to work with, because they're already curious and see value in the content. But it also increases expectations. Unlike a traditional classroom environment where accountability is primarily upward, in adult learning we are accountable to our students. If our materials are not clear or coherent, we definitely hear about it.
Workshop Goals and Design
What is the value of live online instruction anyway?
Unlike a traditional classroom, online instruction forces us to rethink our value proposition because all of a sudden there's instant competition on par with what we're delivering - the vast amount of high-quality pre-recorded content out there. Groups like ours have to ask ourselves why someone would sign up for our live online workshop when there are hundreds of hours of video and online lessons available on demand, often for free or a very modest subscription. Should we just save ourselves the trouble and point them to existing resources?
Online instruction forces us to rethink our value proposition because all of a sudden there's instant competition on par with what we're delivering - the vast amount of high-quality pre-recorded content out there.
Having taught numerous workshops both in-person and online, as well as recorded many hours of video tutorials on computational modeling, I see a number of niches for live instruction that recorded on-demand content will never be able to match.
1. Helping beginners get up to speed. Beginners often need to hear concepts presented in different ways, with lots of examples, and the opportunity to ask questions about fundamental terms and concepts. For this audience, live presentations from an experienced instructor, coupled with a generous number of check-ins and opportunities for Q&A, are an efficient and effective way to move up the learning curve.
2. Reaching verbal learners. Many people are verbal learners, and cement their understanding by echoing and exploring concepts they've just encountered. Live instruction, when done well, can provide opportunities for feedback on both the breadth and depth of their understanding from peers and the instructor. This is particularly helpful when people are digesting new ways of thinking or exploring how new tools might be useful in their world, and it's harder to do with pre-recorded content. Introducing important concepts before the technical material, and incorporating glossaries and the like, helps of course, but there is still no opportunity for Q&A, and some learners may get left behind.
3. Exploring applications. Another learning goal that benefits greatly from live instruction concerns how to apply a technology to a particular domain. "How GIS can help you manage your ranch" would be an example of this type of learning goal. A workshop mixing basic concepts and tools, case studies, open-ended discussions, and perhaps group feedback on participants' action plans might work well for this type of goal.
4. Planning to engage. It's worth noting that while live online instruction can facilitate discussion, it is by no means guaranteed unless built in by the instructor. This means doing things like capping the student-to-instructor ratio, building in pause points to address questions and have discussions, and structuring exercises to encourage interaction. One of my best virtual workshops this year was one in which the exercises were done in small groups in Zoom breakout rooms. Each breakout room had an instructor in the wings, but the students basically talked through the exercise and helped each other work through the material. Contrast that to a more traditional approach, where the instructor gives an intro presentation for 45 minutes and then prompts the students to individually work through a long exercise.
5. When live instruction loses out to pre-recorded. The higher up you go on the learning curve, the more live instruction starts to lose its edge. Intermediate and advanced users know what to ask Google or YouTube, and are knowledgeable enough to follow a guide.
Niche topics are another domain where the scales probably tip in favor of pre-recorded content. I've already decided the tutorials for an R package I'm developing to import climate data into R (an important but decidedly niche task) will be primarily on-demand recorded videos. Coupled with those, however, will be an extension of our online office hours, which I've found to be a much better way to help intermediate and advanced users work through challenges that can't be resolved by a Google search.
The higher up you go on the learning curve, the more live instruction starts to lose its edge.
What to do about the registration fee?
The question of registration fees for training is always an interesting conversation. When done thoughtfully, conversations about how much to charge delve into the host organization's mission, financial realities in the short and long term, organizational branding and outreach, and the needs and impact on related activities. More often than not, these discussions illuminate both tensions and synergies between different goals and perspectives, and the final result reflects some combo of necessity and compromise.
In my experience, there are a half dozen principles, or logics, people commonly invoke when thinking about how to price a workshop:
1) Make it free
Making workshops free has great appeal at many levels. It aligns with the values and mission of public universities to provide training and research for the masses. Since most of us are already paid by taxpayers directly or indirectly, training is something we can do to give back and make all the effort we put into research and innovation pay off. Technology is also liberating, so making it available to everyone is a way to reach those parts of society that need it the most.
Despite all of the above, we've come to conclude that charging nothing is rarely a good idea. In both in-person and online training, we've seen the worst attendance rates at workshops where there was no registration fee. Not surprising really. It's easy to sign up for events that bear no cost, no commitment, and no risk. We all do it. But that makes it equally easy to blow off when something else comes up at the last minute. I put a lot of effort into preparing for workshops, so even if the workshop itself goes well, if only 5 out of 30 registrants show up I feel like my time has not been well spent.
Charging even a nominal amount like $5 will not only increase the attendance rate, but increase the odds that the right people show up. That small fee will get people to read the workshop details a little more closely, and decide if it's really worth their time. They're also more likely to do any preparation tasks you send them, such as installing software and downloading data.
Charging even a nominal amount like $5 will not only increase the attendance rate, but increase the odds that the right people show up.
Of course there are tradeoffs and limits to charging even a nominal fee. In our case, as a small unit within a public university, just the mechanics of collecting money come with a lot of rules to follow. We're fortunate that we can outsource this task to another unit (thank you PSU!). If we had to do it ourselves, collecting small fees may not even be worth the effort. Also, when we teach on one of our own campuses, we can't really charge students to attend (this is also where our attendance rates are most unpredictable).
We also don't want cost to ever be a barrier to people who need the training but lack the resources. Our go-to model has always been to provide a fee waiver or scholarship application whenever there's a fee involved. This isn't hard to conceptualize or incorporate, but it's more work for someone to plan and execute (and probably not what the instructor wants to focus on).
2) Meet financial necessities
Sometimes financial necessity is the major driver behind revenue goals. If an organization's revenue model depends on income from training, those targets will certainly be front and center. Salary support is likely to be a major factor in these cases. Sometimes the training program itself may require a dedicated instructor or coordinator, although this seems less common in the geospatial world than other fields. Another common case is when some staff are on soft money and there are shortfalls or gaps in funding. There could also be another project within the organization, perhaps a pilot project, that is under-funded and needs financial support to keep going. For public institutions like ours, all of this is framed by the long-term trends in state funding, which lately seems to fluctuate between flat in good years and downward in bad years.
3) Recoup implementation costs
Another common goalpost for setting revenue goals is how much it costs to run the workshop. For in-person events, this typically includes the cost of renting a venue, snacks, and transportation for the instructors. On a couple of occasions we've also been asked to pay for a computer lab administrator's time to install software. The costs for virtual workshops are generally much lower, but could include the cost of your virtual platform. As of this writing, a 500-person Zoom webinar license costs $1400 / year. Other online platforms will be in the same ballpark, possibly more if they're tailored to virtual instruction. If you choose to hire someone to support registration, marketing, or back-end support, those will be additional costs. An expense we encountered for the first time in 2020 was the need to hire American Sign Language interpreters for one of our virtual workshops, at the rate of $140/hour.
Recouping implementation costs has always been the 'floor' of the revenue goals for our workshops, because we don't have a budget line for training operations. In many cases we stop at cost recovery because we or our collaborators want the training to be as accessible as possible. In the same spirit, we keep costs down to a bare minimum by collaborating with partners who can contribute meeting space, computer facilities, support staff, or even accommodation.
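To make the cost-recovery 'floor' concrete, here is a back-of-the-envelope sketch of the arithmetic. The cost figures borrow the Zoom license and ASL interpreter rates mentioned above; the attendance numbers, scholarship count, and the idea of pro-rating the license across four workshops a year are purely illustrative assumptions, not figures from our actual budgets.

```python
# Illustrative break-even registration fee calculator.
# All specific numbers below are assumptions for the sake of example.

def break_even_fee(fixed_costs, expected_attendees, scholarship_seats=0):
    """Registration fee needed to recoup fixed costs, given that
    scholarship seats generate no revenue."""
    paying = expected_attendees - scholarship_seats
    if paying <= 0:
        raise ValueError("Need at least one paying attendee")
    return fixed_costs / paying

# Example: a virtual workshop carrying a quarter of a $1,400/yr Zoom
# webinar license (assuming four workshops/year) plus 4 hours of ASL
# interpretation at $140/hour.
costs = 1400 / 4 + 4 * 140   # = $910
fee = break_even_fee(costs, expected_attendees=60, scholarship_seats=10)
print(round(fee, 2))         # fee per paying attendee
```

Even a rough calculation like this makes it easier to see how a handful of scholarship seats shifts the fee for everyone else, which is useful when deciding how many waivers you can absorb.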
4) What will the market bear?
Another approach to thinking about price is to research the going rate for similar types of training. This isn't always practical, because what we offer is typically not available elsewhere (a good thing; otherwise we probably shouldn't be doing it), and the "market" we're targeting may be very different from what other programs are serving. Nonetheless, there are many groups around the country and the world who provide geospatial training of various forms, and it can be informative to do a little price research to get a benchmark.
5) Prioritize people, not dollars
Most groups that offer geospatial training have other core activities, such as funded research, consulting services, or software development. In many cases, the real value in providing training is not a revenue stream but the relationships it develops and the market share it builds. Reaching people may therefore be a more important goal than maximizing revenue. And not just any people: it has to be the right people. This will probably favor keeping the registration fee relatively modest, but you might want to bump it up if you're aiming for a smaller niche audience.
Research-centric groups like ours fall into this general category. Providing training is part of our mission but not the core. Our bread-and-butter work is technology innovation and bringing geospatial expertise to research and extension collaborations. Hence the goals for all our training initiatives include a bit of marketing, expanding and strengthening our networks, and gathering ideas for new research and extension projects. This is our core business, and our training has definitely helped develop it. Due in part to our exposure through workshops, we've become a hub for both knowledge and people in our core areas of expertise, and have had several funded projects that started out as a side conversation at a workshop (something I wish Zoom was better equipped for!).
6) Can we get donations or sponsorship?
Aside from registration fees, donations and sponsorship may be able to support workshop costs. This requires further strategic thinking, because donors and sponsors will only be interested in supporting communities and events that advance their interests. In the geospatial world, this will likely include marketing opportunities for a sponsor's goods and services, so the audiences have to match. Another goal for many private and non-profit organizations is expanding technology to underserved audiences. These are elements that can't be tacked on to an agenda two weeks before the workshop is held. They require thoughtful planning and explicit efforts from the very beginning in how the workshop is framed, designed, and marketed.
We've dabbled in soliciting donations and sponsorships over the years for DroneCamp, with modest results. Corporate sponsors haven't been a great fit, because they're looking for exposure to large numbers of potential customers, which our in-person workshops simply don't deliver. That might change, however, with more online programming - over 330 people attended our first virtual DroneCamp in 2020. Raising money for scholarships has been an easier sell, because the amounts are lower and what we provide is really valuable to several resource-poor and underserved audiences, starting with students. All of this requires a lot of strategizing and legwork, which ultimately may be the biggest bottleneck in raising this type of funding.
Techniques for Online Teaching
More Important than Ever: Engagement and Active Learning
Classroom teachers quickly learn that the keys to keeping students engaged and focused are active learning and a bit of performance. The same is even more true for online instruction. In addition to all the usual distractions and challenges of presenting complex material, online instructors have to contend with dozens of other windows to click on, temptation from other devices, and Zoom fatigue. A long-winded presentation in a classroom may result in some glassy eyes; that same dry presentation given online may result in losing your students and never getting them fully back.
Being an engaging presenter starts with your screen persona. The usual characteristics of being a good presenter translate well - personable, spontaneous, good use of intonation, a pace that is neither slow nor fast, a little comedy here and there, etc. I often don't feel naturally engaging, particularly when I've been scrambling to develop new content, but I've also learned that much of this is performative, can be practiced, and to some extent scripted.
However, the biggest impact on the level of engagement in my experience is not the instructor's personality but the workshop structure and flow. Gone is my traditional 45-minute opening presentation. My general rule of thumb now is to get people doing something hands-on within 15 minutes. Even if it's just running sample code they don't really understand yet, I aim to get them quickly interacting with the software and keep them busy after that. Not only is this more interesting than listening to my voice, it also motivates them to watch the next batch of slides or demo.
The biggest impact on the level of engagement in my experience is not the instructor's personality but the workshop structure and flow.
There are many other tried and true techniques to make online instruction more engaging. Polls, white board activities, panel discussions with audience questions, etc., are all fairly easy to integrate in platforms like Zoom. I'm a big fan of Google Docs and Etherpad (an open source version of Google Docs) for more collaborative work that gives everyone a chance to contribute. I have yet to do collaborative diagramming using tools like Miro or Google Jamboard but these can help collect and organize ideas or topics from a group.
Teach According to How People Learn
The highlight of many software workshops are the hands-on exercises, where a guided workflow culminates in a satisfying output almost by magic. Students love it, but we all know that no matter how satisfying, completing an exercise by itself does not represent successful learning. Getting students to the point where they can apply the concepts and tools to solve different problems is a far higher bar to meet, and requires us to teach according to how people actually learn.
There are many aspects of aligning teaching with learning, but some of the most pertinent ones for software instruction are i) understanding your students' background and aspirations, ii) constantly flipping back and forth between higher level concepts and the nitty-gritty of an application, iii) dishing out material in mentally digestible chunks, and iv) getting continuous feedback in the form of assessments. This list is just scratching the surface; tomes have been written about effective instruction. A good overview on the subject I have bookmarked is a recent webinar What Every Data Scientist Should Know About Education, by Greg Wilson, co-founder of Software Carpentry and now at RStudio.
Software Setup: An Old Headache That Just Got Worse
One of the biggest headaches in any software-centric workshop is getting everything set up on the computers. Geospatial workshops in particular have several requirements: software has to be installed, licenses obtained and activated, and workshop materials, including data, distributed. One of the biggest benefits of conducting workshops in computer labs is that this setup process can be dealt with beforehand; in a virtual setting, however, that is no longer an option.
This past summer I was a helper in a 3-hour live online workshop where literally the first hour was spent on setup. The 'normal' way of installing a critical extension had recently broken due to an update (care to guess which software company?). More than half of the participants couldn't proceed, so the instructor had to walk through an alternative setup process. Spending a third of the workshop on setup is a bit extreme, but even in a 'good' workshop there is always a handful of people who fall behind from the get-go because they're working through setup problems. Instructors have a tough choice to make when this happens: take time to help a few people who can't get started, to the detriment of everyone else, or keep going and hope they catch up.
Our group has learned that setup issues are best dealt with proactively, before the workshop starts. Starting with the workshop description, I make it clear that i) setup is a required task, ii) it is the responsibility of each participant, and iii) it needs to be complete before the workshop starts. This is coupled with very detailed setup instructions that are shared at least a week in advance, and pre-workshop tech support for anyone who needs help.
Our group has learned that setup issues are best dealt with proactively, before the workshop starts.
DroneCamp 2020 was one of our biggest challenges ever regarding setup. There were six hands-on workshops involving five large photogrammetry and GIS programs (Pix4D, Metashape, OpenDroneMap, ArcGIS Pro & QGIS). Data downloads for some of the workshops were several gigabytes. Anticipating challenges, we first held full practice workshops with each instructor 2-3 weeks in advance, to test everything including setup. Those lessons were incorporated into detailed installation instructions, which were posted on the website over a week in advance. We also had two days of drop-in office hours the week before for anyone having trouble.
To provide some structure and incentive for completing setup, we also assigned a modest "Step Zero" exercise for each program, so participants could show that they had successfully installed the software and loaded some data. Although we stopped short of collecting these before the course began, we made it quite clear in our messaging that if people didn't have their computers set up in time for the workshop we weren't going to be able to help them, but they could still learn something by watching the slides and demo. This combo of clear communication about expectations, detailed instructions, pre-workshop tech support for those who need it, and plenty of time to get things set up generally paid off. None of the workshops got bogged down with setup issues, and instructors were able to jump right into the main content.
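Our actual Step Zero exercises were specific to each program (loading data in Pix4D, Metashape, etc.), but for a code-based workshop the same idea can be automated. The sketch below is a hypothetical self-check script, not one we actually distributed; the module names and data filename are placeholder assumptions a real workshop would swap for its own dependencies.

```python
# Hypothetical "Step Zero" self-check script for a Python-based workshop.
# REQUIRED_MODULES and SAMPLE_DATA are placeholder assumptions; substitute
# your workshop's actual dependencies and sample dataset.

import importlib.util
import os

REQUIRED_MODULES = ["numpy", "pandas"]   # assumed workshop dependencies
SAMPLE_DATA = "dronecamp_sample.tif"     # assumed pre-workshop download

def step_zero():
    """Return a list of setup problems (empty list means ready to go)."""
    problems = []
    for name in REQUIRED_MODULES:
        # find_spec checks availability without actually importing
        if importlib.util.find_spec(name) is None:
            problems.append(f"missing package: {name}")
    if not os.path.exists(SAMPLE_DATA):
        problems.append(f"missing data file: {SAMPLE_DATA}")
    return problems

if __name__ == "__main__":
    issues = step_zero()
    print("Setup OK!" if not issues else "\n".join(issues))
```

Having participants run something like this and paste the output into a pre-workshop form gives you a head count of who is actually ready, which is exactly the signal our manual Step Zero exercises were designed to produce.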
The Single Screen Problem
A typical live software workshop requires participants to juggle a minimum of 4 windows. You have the instructor's webcam, the instructor's screen share (slides or a software demo), the participant's application (e.g., ArcGIS Pro or Pix4D), and a written exercise guide. On top of that, everyone needs access to the Zoom chat window, the Participants pane to raise their hand, the Zoom control bar to mute and unmute themselves, and possibly a note-taking application.
In the best case scenario where someone has one or even two external monitors, that's a lot of windows to manage. For participants who have nothing but their laptop, it's almost hopeless.
I try to mitigate the single screen problem in my workshops proactively. Starting with the workshop description, I strongly encourage people to use a second monitor. I've never made it an outright requirement, although I've been tempted to do so. I also provide tips on juggling multiple windows on a single screen, including third party utilities that allow you to pin certain windows on top, and utilities that can save a configuration of window sizes.
Because there are so many windows to juggle, I try to be cognizant during the workshop that not all participants may be seeing the slide I'm talking about or the application window I'm sharing. I constantly verbalize which window or slide I'm on, where I'm clicking, etc., and whenever we switch gears I pause to give people a chance to get the right window open. For my R workshops, I've also modified how students do the short exercises. I used to have them copy-paste prompts and starter code from a slide deck into RStudio. That kept them active (a good thing) and ensured they were following the slides as well. However, that gets cumbersome very fast on a single screen. I've since switched to R Notebooks, where the prompts and starter code are ready-to-go, and I can use markdown text to explain the task.
Because there are so many windows to juggle, I try to be cognizant during the workshop that not all participants may be seeing the slide I'm talking about or the application window I'm sharing.
The Small Screen Problem
Related to the single screen problem is the small screen problem. I have a beautiful 26" high-resolution external monitor, so I never have problems with legibility or enough real estate for multiple windows. But the participants viewing my screen share may be working on a 14" laptop. Zoom offers viewers two options when the screen being shared is a different resolution than their own: automatically scale down my high-res screen share (and reach for a magnifying glass to read the tiny font), or crop it and just watch part of my screen share (hopefully the right part).
I don't know of a great solution, but when I teach I try to mitigate the small screen problem by first reducing the resolution of my beautiful 26" external monitor down to a modest 1080p, even though it's capable of considerably more. That makes it look slightly fuzzy to me, but I know it will come out better for most of the people watching my shared screen. To make it easier to follow my mouse, I also increase the mouse pointer size, enable a feature that allows me to visually highlight the mouse location when I press the control key, and sometimes turn on mouse trails. I've seen other presenters, particularly on YouTube, use Windows Magnifier or a similar utility like ZoomIt to help people follow details.
But most importantly, I try to tackle the small screen problem by simply slowing down and verbalizing each step when I'm doing a demo. I also generally invite participants to just watch a few steps, and then pause so they can repeat those steps on their own, rather than try to keep pace with me. As we go, I ask how people are doing so I can adjust my pace. There may be one or two who just get hopelessly behind, and others who might be bored because I'm going too slow, but my goal is to make sure most people are able to follow.
Part II: Goals and Directions for 2021