A University of California, Berkeley robotics lab is developing AI systems for polyculture gardening as part of AlphaGarden, an AUTOLAB project that wants to find out if humans can train robotic control systems to fully automate a polyculture garden of edible plants and invasive species. The AUTOLAB robotics laboratory is perhaps best known for creating the DexNet system for robotic grasping.

AUTOLAB director Ken Goldberg said the goal is to find out whether AI can learn a function as complex as polyculture gardening, or farming with multiple species of plants growing alongside one another, as opposed to monoculture growing, the single-crop strategy commonly practiced today.

“I think that’s an open question. I don’t know if we can,” he said. “It certainly might be interesting to be able to have a fully automated garden. In my own view it’s probably unlikely to be viable as a real functioning productive garden. I think that it’s going to be very hard to learn, and that’s the art side of the lesson, which is that nature is very complex, and that we can put some very complex machinery on it, but it’s not going to necessarily open up and be controllable.”

At launch, eight students in the AlphaGarden Collective pruned and planted alongside a robotic FarmBot Genesis system that automates water dispensing inside a greenhouse at UC Berkeley. With COVID-19 forcing the closure of the university, students will now focus on polyculture garden simulations and models instead of working in the UC Berkeley greenhouse.


Participants in the project want to learn from the real-life garden as well, because simulations can only get so close to predicting real life, and polyculture gardens can be unpredictable.

“For every real garden, we have 100,000 or millions of gardens that can be generated,” Goldberg said. “This runs at 100,000 times faster than nature so you can accelerate time dramatically, and for each one you can say, ‘Well, if I tweak these parameters in my control policy, here’s what the outcome will be in terms of how often you water, in what conditions you water, etc.’”
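To give a flavor of the kind of parameter sweep Goldberg describes, here is a toy Python sketch. It is not the AlphaGarden simulator; the garden model, function names, and parameter values are invented purely for illustration of how a watering policy can be evaluated across many fast simulated gardens.

```python
import random

def simulate_garden(water_interval_days, water_threshold, days=60, seed=0):
    """Crude stand-in for a garden simulator: returns a plant-coverage score."""
    rng = random.Random(seed)
    moisture, coverage = 0.5, 0.1
    for day in range(days):
        # Policy: water on a fixed schedule or when soil gets too dry.
        if day % water_interval_days == 0 or moisture < water_threshold:
            moisture = min(1.0, moisture + 0.3)
        moisture = max(0.0, moisture - rng.uniform(0.05, 0.15))  # evaporation
        growth = 0.02 if 0.3 < moisture < 0.9 else -0.01  # grows in a moisture band
        coverage = min(1.0, max(0.0, coverage + growth))
    return coverage

# Sweep policy parameters across many simulated gardens and keep the best.
best = max(
    ((interval, thresh,
      sum(simulate_garden(interval, thresh, seed=s) for s in range(100)) / 100)
     for interval in (1, 2, 3, 5)
     for thresh in (0.2, 0.3, 0.4)),
    key=lambda t: t[2],
)
print("best watering policy (interval, threshold, mean coverage):", best)
```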

AlphaGarden Collective members began the first growing cycle on January 1 and, due to limited access to the university greenhouse, plan to begin the second cycle in April or May. The project will continue for the coming two to three years, he said.

Goldberg says AlphaGarden is both an art and science installation meant to contrast the complexity of nature with the complexity of AI.

“AI is incredibly complex — you’re throwing a lot of a technology and theory and processing at these problems — but when faced with the complexity of just a single polyculture garden, it’s meeting its match because gardens are incredibly complex,” he said.

AlphaGarden is part of The Question of Intelligence, an exhibit of more than a dozen other projects at The New School in New York City that examines differences between human and machine learning and the impact of automation on human senses. The exhibit was originally scheduled to run until April, but it is now closed due to the coronavirus pandemic.

Harvard MetaLAB senior researcher Sarah Newman is an advisor to the project and called AlphaGarden an effort to study the nature of diversity and explore the limitations of AI in the context of ecology and sustainability.

“AlphaGarden foregrounds the beauty of nature and exposes the limitations of AI and robots,” she said. “There will always be a distance between simulation and reality.”

AlphaGarden resembles TeleGarden, an online gardening webcast project Goldberg led from 1995 to 2004. With AlphaGarden, despite limited access at the moment, a time-lapse visualization is updated daily to demonstrate progress.

AlphaGarden uses hardware from FarmBot rigged to an overhead gantry, together with cameras for data collection and sensors for measuring things like soil moisture.

“We just took that [FarmBot] off the shelf, but where we’re coming in is by putting a camera way up top overhead, and that’s the global image that you see from a bird’s-eye view; then we’re basically taking images every day, and we’re basically trying to monitor the state of the garden every single day so that we can see how it evolves, and then start to understand what the effect of actions are,” Goldberg said.
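As a rough illustration of the daily monitoring loop Goldberg describes, a minimal Python sketch using OpenCV might look like the following. The camera index, directory name, and function names are hypothetical, not taken from the AlphaGarden code.

```python
import datetime
import os

import cv2  # OpenCV, assumed available for camera capture and image I/O

def capture_overhead_image(camera_index=0):
    """Grab a single bird's-eye frame from the overhead camera."""
    cap = cv2.VideoCapture(camera_index)
    ok, frame = cap.read()
    cap.release()
    if not ok:
        raise RuntimeError("camera read failed")
    return frame

def daily_snapshot(out_dir="garden_images"):
    """Save today's frame so the garden's state can be compared day to day."""
    os.makedirs(out_dir, exist_ok=True)
    frame = capture_overhead_image()
    path = f"{out_dir}/{datetime.date.today().isoformat()}.png"
    cv2.imwrite(path, frame)
    return path
```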

AlphaGarden also builds on the PlantCV computer vision system for leaf identification to recognize specific plants in the garden of herbs and vegetables. Such leaf identification systems can be used by plant growers to monitor growth with plant phenotyping.
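The project uses PlantCV for its leaf identification; as a generic illustration of the underlying idea (segment foliage in an overhead image, then measure per-plant area for phenotyping), here is a minimal sketch using OpenCV rather than PlantCV's own pipeline. The HSV thresholds and minimum area are illustrative assumptions.

```python
import cv2
import numpy as np

def measure_foliage(image_path, min_area=500.0):
    """Segment green foliage in an overhead image and return per-blob areas."""
    img = cv2.imread(image_path)
    hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
    # Rough HSV band for green foliage; a real pipeline calibrates this per camera.
    mask = cv2.inRange(hsv, np.array([35, 40, 40]), np.array([85, 255, 255]))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    # Filter out small specks and report the area of each plant-sized blob.
    return [cv2.contourArea(c) for c in contours if cv2.contourArea(c) >= min_area]
```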

Demos of the robot, simulations, and computer vision system in action can be found at alphagarden.org.