Augmenting citizen science with computer vision for fish monitoring
MIT Sea Grant works with the Woodwell Climate Research Center and other collaborators to demonstrate a deep learning-based system for fish monitoring.