Monday,
January 10, 2022
6:00 - 7:30 pm EST
Note: ETGS members will receive an email with info for logging into the meeting.
January Presentation
Archaeological Machine Learning: Using remotely sensed imagery to find and map
archaeological features
By
Leila Character (previously Donn)1, Tim Beach1, Cody Schank1,
Takeshi Inomata2, Agustin Ortiz Jr.3, Adam Rabinowitz4,
Tom Garrison1

1. Department of Geography and the Environment, University of Texas at Austin (leiladonn@utexas.edu)
2. School of Anthropology, University of Arizona
3. Underwater Archaeology Branch, Naval History and Heritage Command
4. Department of Classics, University of Texas at Austin
Abstract
We are creating a series of supervised machine learning models to
predict and map the locations of unknown or unmapped archaeological
features using remotely sensed imagery. The goal of this work is to
create an efficient, cost-effective, and replicable method of
rapidly mapping archaeological sites, including those that are very
large. This project began in 2018 with the goal of creating a
targeted method of finding cave entrances at Maya archaeological
sites located in the dense tropical forests of Guatemala and Belize.
In 2019, we used a random forest classifier, airborne laser scanning
(ALS) data, and a training dataset of known caves to successfully
identify several previously undocumented caves in northwestern
Belize. Two of these caves contained archaeological materials.
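The random forest workflow described above can be sketched in a few lines. This is a minimal illustration, not the authors' code: the feature names (slope, curvature, relief) and the synthetic data are stand-ins for terrain metrics that would actually be derived from ALS elevation models and labeled with known cave locations.

```python
# Minimal sketch of a random-forest cave-entrance classifier.
# Features and labels here are synthetic stand-ins for ALS-derived
# terrain metrics (e.g., slope, curvature, local relief) and a
# training set of known cave entrances.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Synthetic feature table: one row per terrain cell
n = 200
X = rng.normal(size=(n, 3))  # columns: [slope, curvature, relief]
# Synthetic labels: 1 = known cave entrance, 0 = background terrain
y = (X[:, 1] + 0.5 * X[:, 0] > 1).astype(int)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X, y)

# Score every cell; high probabilities flag candidate cave locations
probs = clf.predict_proba(X)[:, 1]
candidates = np.flatnonzero(probs > 0.8)
print(len(candidates), "candidate cells flagged")
```

In practice the flagged cells would be mapped back to coordinates and ground-truthed in the field, which is how the previously undocumented caves were confirmed.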
Building on this work, modeling has been expanded to include other
types of hidden and obscured features that colleagues are interested
in studying. These include ancient Maya archaeological features in
Guatemala and Mexico, shipwrecks off the coast of the United States,
and ancient burial mounds in Romania. The models for the
archaeological features take ALS, sonar, and multispectral imagery
as input, are based on existing convolutional neural network
architectures, and make use of transfer learning. These models can
be used to create more accurate maps of archaeological features to
aid management objectives, study patterns across the landscape, and
find new features. Such models can easily be adjusted to identify
other types of features and accept different types of imagery as
input. This work seeks to make machine learning methods accessible
to non-computer scientists interested in the study, management, and
conservation of archaeological heritage.
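The transfer-learning setup described in the abstract can be sketched as follows. This is a hypothetical illustration, not the authors' model: the small convolutional backbone stands in for a published CNN architecture that, in practice, would be loaded with pretrained weights before a new classification head is trained on labeled ALS, sonar, or multispectral image chips.

```python
# Minimal transfer-learning sketch: freeze a feature-extraction
# backbone and train only a new two-class head ("feature" vs.
# "background"). The backbone here is a toy stand-in for a
# pretrained CNN.
import torch
import torch.nn as nn

backbone = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
)

# Transfer learning: freeze the backbone so only the head is trained
for p in backbone.parameters():
    p.requires_grad = False
head = nn.Linear(32, 2)  # new task-specific classification head

# Forward pass on a dummy batch of 3-band image chips
x = torch.randn(4, 3, 64, 64)
logits = head(backbone(x))
print(logits.shape)
```

Swapping in a different imagery type mainly means changing the number of input bands and the labeled chips, which is why such models adapt readily to new feature types.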
Biography
Leila Character
(previously Donn) is a PhD candidate in the Department of Geography
and the Environment at the University of Texas at Austin whose
research focuses on using machine learning and remotely sensed
imagery to find and map
archaeological features. Her work seeks to create an efficient,
cost-effective, and replicable method of rapidly mapping
archaeological sites, including those that are very large. Leila's
work includes partners from academia, federal government, and
private industry. Her current projects include a shipwreck
identification model being completed in partnership with the US
Navy, a Maya archaeological feature identification model being
completed in partnership with a group of archaeologists that work in
Guatemala and Mexico, and an ancient burial mound identification
model being completed with a group of archaeologists that work in
Romania. She has also partnered with an artificial intelligence
start-up on a project to create a multi-species tree and health
status classifier using hyperspectral imagery collected by drone.
Prior to beginning her PhD, she completed a master's degree in the
same department focused on the use of lidar and geoarchaeological
methods to study the land-use patterns of the ancient Maya in
north-central Belize. Her B.S. is in geology (from Sewanee: The
University of the South in Tennessee) with a minor in anthropology
focused on archaeology. Between receiving her B.S. and M.A., she
worked for five years as a geologist and environmental scientist in
Alaska, Texas, and Tennessee. You can contact Leila at
leiladonn@utexas.edu,
check out her website at
https://leilacharacter.wixsite.com/leilacharacter, and follow
her on Instagram for research updates at @leilacharacter.
Greetings! We hope you will join us for the next ETGS virtual meeting, and that you, your family, and your colleagues are staying healthy and well.
As a courtesy, please mute your cell phone or the microphone in your laptop/tablet to minimize background noise and feedback echoes. We will try to mute all participants until the presentation is finished. Please use the chat feature to type comments or questions during the presentation. We recommend that you send questions for the speaker to "everyone" so all participants can see the question. After the presentation, the speaker will answer questions. During this Q&A period, you may unmute if you wish to ask a question verbally.
We will create an attendance list based on the participant names we can see during the meeting. This is helpful for those who need to document participation to support Professional Geologist registrations. It is not always possible to tell who is participating, especially for those joining by phone, so please email your name to etgs@live.com to be listed on the attendance sheet. Let us know exactly how your name should appear on the list.
Thank you for your patience and understanding as we continue adapting to this virtual format. As always, we welcome and appreciate your feedback and suggestions for improvement.
Page updated December 14, 2021