Both the NASS Cropland Data Layer (CDL) and the National Land Cover Database (NLCD) released new versions in early 2014. Links for download are here:
- CDL:
- NASS users can download the 2008–2013 CDLs, the 2013 confidence layer, and the 2013 cultivated layer from National Downloads (file sizes > 2 GB).
- View their latest presentations in PDF format.
- NLCD:
- The National Land Cover Database (NLCD 2011) is made available to the public by the U.S. Geological Survey and partners.
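Once downloaded, the CDL is a categorical GeoTIFF of crop class codes at 30 m resolution. As a minimal sketch of a common first step, here is a per-class acreage tally; the small array below is a synthetic stand-in for a window read from the real raster, and the class codes (1 = corn, 5 = soybeans) follow the NASS code list:

```python
import numpy as np

# Synthetic 4x4 window of CDL class codes (1 = corn, 5 = soybeans,
# 0 = background). A real window would be read from the downloaded GeoTIFF.
cdl = np.array([
    [1, 1, 5, 0],
    [1, 5, 5, 0],
    [1, 1, 5, 5],
    [0, 0, 1, 1],
])

# CDL pixels are 30 m x 30 m = 900 m^2; 1 acre = 4046.8564224 m^2.
PIXEL_M2 = 30 * 30
M2_PER_ACRE = 4046.8564224

# Count pixels per class, then convert counts to acres.
codes, counts = np.unique(cdl, return_counts=True)
acres = {int(c): int(n) * PIXEL_M2 / M2_PER_ACRE
         for c, n in zip(codes, counts)}

for code, a in sorted(acres.items()):
    print(f"class {code}: {a:.2f} acres")
```

The same tally scales to a full state mosaic by iterating over raster windows and summing the per-class counts before converting to acres.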
An interesting position piece on the appropriate uses of big data for climate resilience. The author, Amy Luers, points out three opportunities and three risks.
She sums up:
"The big data revolution is upon us. How this will contribute to the resilience of human and natural systems remains to be seen. Ultimately, it will depend on what trade-offs we are willing to make. For example, are we willing to compromise some individual privacy for increased community resilience, or the ecological systems on which they depend?—If so, how much, and under what circumstances?"
Read the full article here.
In 1998 Al Gore made his now-famous speech entitled The Digital Earth: Understanding our planet in the 21st Century. He described the possibilities and need for the development of a new concept in earth science, communication and society. He envisioned technology that would allow us "to capture, store, process and display an unprecedented amount of information about our planet and a wide variety of environmental and cultural phenomena." From the vantage point of our hyper-geo-immersed lifestyle today, his description of this Digital Earth is prescient yet rather cumbersome:
"Imagine, for example, a young child going to a Digital Earth exhibit at a local museum. After donning a head-mounted display, she sees Earth as it appears from space. Using a data glove, she zooms in, using higher and higher levels of resolution, to see continents, then regions, countries, cities, and finally individual houses, trees, and other natural and man-made objects. Having found an area of the planet she is interested in exploring, she takes the equivalent of a 'magic carpet ride' through a 3-D visualization of the terrain."
He said: "Although this scenario may seem like science fiction, most of the technologies and capabilities that would be required to build a Digital Earth are either here or under development. Of course, the capabilities of a Digital Earth will continue to evolve over time. What we will be able to do in 2005 will look primitive compared to the Digital Earth of the year 2020." In 1998, he listed the necessary technologies as: computational science, mass storage, satellite imagery, broadband networks, interoperability, and metadata.
He anticipated change: "Of course, further technological progress is needed to realize the full potential of the Digital Earth, especially in areas such as automatic interpretation of imagery, the fusion of data from multiple sources, and intelligent agents that could find and link information on the Web about a particular spot on the planet. But enough of the pieces are in place right now to warrant proceeding with this exciting initiative."
Much has changed since he gave his talk, obviously. We have numerous examples of Virtual Globes for data exploration - for example, Google Earth, NASA's WorldWind, ESRI's ArcGIS Explorer, Bing Maps 3D, TerraExplorer, and Marble. (These virtual examples are made tangible with NOAA's terrific Science on a Sphere project.) We have also realized a new vision of the Digital Earth that includes much more than immersive viewing of data. Today's Digital Earth vision(s) include analytics and expertise for solving problems that are often cross-disciplinary and large scale. Additionally, we make much more use today than was anticipated in 1998 of sensor networks and the geoweb (e.g. volunteered geographic information and crowdsourcing). Examples of this multi-disciplinary Digital Earth concept include Google Earth Engine (and its recent forest loss product), NASA Earth Exchange, and our own HOLOS.
UC Berkeley is establishing a new institute to enable university researchers to harness the full potential of the data-rich world that today characterizes all fields of science and discovery. The Berkeley Institute for Data Science (BIDS) will be part of a multi-million dollar effort supported by the Gordon and Betty Moore Foundation and the Alfred P. Sloan Foundation.
The new 5-year, $37.8 million initiative was announced today at a meeting sponsored by the White House Office of Science and Technology Policy (OSTP) focused on developing innovative partnerships to advance technologies that support advanced data management and data analytics.
The ambitious Moore/Sloan partnership, which also includes New York University and the University of Washington, will spur collaborations within and across the three campuses and other partners pursuing similar data-intensive science goals. The three PIs who lead the respective campus efforts – Saul Perlmutter at UC Berkeley, Ed Lazowska at the University of Washington, and Yann LeCun at NYU – will promote common approaches to form the basis for ongoing collaboration between the three campuses.
To provide a home for the new Berkeley Institute for Data Science, UC Berkeley has set aside renovated space in a historic library building on the central campus, in 190 Doe Library. The Institute is expected to move into its new quarters in spring 2014. To help address challenges related to creating and sustaining attractive career paths, the new Institute will offer new Data Science Fellow positions for faculty, post-doctoral fellows, and staff, to be shared with departmental partners across the campus. The new Institute will also offer support for graduate students and organize short courses, boot camps, hackathons, and many other activities.
More information about specific BIDS programs will be forthcoming in the coming weeks. The new Institute will be launched at a campus event on December 12, 2013. If you or your students and collaborators are interested in participating in the Data Science Faire that day, please register at http://vcresearch.berkeley.edu/datascience/dec12-registration. The deadline is November 25, 2013.
For updates and more information, please visit http://vcresearch.berkeley.edu/datascience/overview-data-science and contact datascience@berkeley.edu with any questions you may have.
The California Geoportal, officially launched in March 2013 (see here for the related launch press release), augments and in some ways replaces the original Cal-Atlas statewide GIS data download webpage with a simpler, smoother, and more intuitive website for all GIS-related data in the state. You can now search or browse for GIS data by geography, and search any corresponding metadata using traditional queries as well as a standalone webGIS interface. The portal also provides direct download links to some Oregon and Nevada state GIS datasets. The site acts as a catalog of publicly available GIS data and related documents and maps from state agencies and local and regional governments: rather than hosting the physical data, it serves as a library of direct download links that connect to the authors' own databases. The site also links you to other state GIS applications, such as the California Coastal Geoportal and webGIS viewers from various state agencies.
See below for an informative video on how and why the portal was created and for highlights of features: