Tuesday, November 15, 2022


(Photo: Schaffer-Nishimura Lab)
Enter the Schaffer-Nishimura Lab at Cornell University, and you might stumble upon a mouse experiencing virtual reality (VR) for the very first time. The mouse isn’t there for an immersive game night, but it is part of the next best thing: neuroscience research.

Dr. Chris Schaffer and Dr. Nozomi Nishimura, two professors of biomedical engineering at one of New York’s most prestigious universities, have created a VR setup that allows them to study rodents’ reactions to virtual experiences. Though the exact purpose of this research appears to be under wraps, the duo says it’s a neuroscience project, in line with the lab’s typical area of study.

The setup is humbly powered by a Raspberry Pi 4, chosen for its accessibility and single-board simplicity. The Raspberry Pi runs the free and open-source game engine Godot, which Schaffer and Nishimura used to create the virtual scenes. Because your average VR headset is obviously far too large for a mouse’s tiny face, the researchers 3D printed custom display eyepieces and a case to keep the physical components together. Each eyepiece contains a 240×210-pixel circular display and a Fresnel lens positioned 15 millimeters from the mouse’s face, providing an immersive 140-degree vertical and 230-degree horizontal field of view.
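To get a feel for how eyepiece geometry relates to field of view, here is a rough back-of-envelope sketch. It treats the Fresnel lens as a simple magnifier with the display near its focal plane, so each eye sees roughly 2·atan(half the display width / focal length). The display width and focal length below are assumed values for illustration only; the 140-by-230-degree figure describes the combined view from both angled eyepieces, which this single-eye approximation does not capture.

```python
import math

def per_eye_fov_deg(display_width_mm: float, focal_length_mm: float) -> float:
    """Approximate angular field of view of one eyepiece under a thin-lens magnifier model."""
    return 2.0 * math.degrees(math.atan((display_width_mm / 2.0) / focal_length_mm))

# Example with assumed numbers (not the lab's actual optics):
# a ~13 mm wide display behind a ~5 mm focal-length Fresnel lens.
print(f"per-eye FOV ≈ {per_eye_fov_deg(13.0, 5.0):.0f} degrees")
```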

(Photo: Schaffer-Nishimura Lab)

To make the experience interactive, the researchers set up a spherical styrofoam treadmill that allows the mouse to move in any direction. As the mouse walks or runs on the treadmill, the sphere’s movement is captured by optical sensors and an Arduino Due, which converts the movement data into a form the Raspberry Pi can use to update the camera in the virtual scene. The result is a VR world that moves in step with the mouse’s movements in the real world.
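As a minimal sketch of that data path, the Python snippet below reads hypothetical ball-displacement messages from the Arduino over USB serial and turns them into camera pose updates. The serial port, the “dx,dy” message format, and the scale factors are all assumptions for illustration; the lab’s actual code and protocol are not detailed in the article.

```python
import math
import serial  # pyserial: pip install pyserial

PORT = "/dev/ttyACM0"       # assumed serial port for the Arduino Due
COUNTS_PER_MM = 40.0        # assumed sensor resolution: counts per mm of ball surface
YAW_GAIN_DEG_PER_MM = 2.0   # assumed mapping from sideways ball motion to camera yaw

def read_motion(ser):
    """Parse one hypothetical 'dx,dy' line of sensor counts; return mm of ball travel."""
    line = ser.readline().decode("ascii", errors="ignore").strip()
    if not line:
        return None
    dx_counts, dy_counts = (int(v) for v in line.split(","))
    return dx_counts / COUNTS_PER_MM, dy_counts / COUNTS_PER_MM

def main():
    cam_x, cam_z, cam_yaw = 0.0, 0.0, 0.0
    with serial.Serial(PORT, 115200, timeout=0.1) as ser:
        while True:
            motion = read_motion(ser)
            if motion is None:
                continue
            dx_mm, dy_mm = motion
            # Treat forward/backward ball motion as translation along the current
            # heading, and sideways motion as a turn (one of several possible mappings).
            cam_yaw += dx_mm * YAW_GAIN_DEG_PER_MM
            cam_x += dy_mm * math.sin(math.radians(cam_yaw))
            cam_z += dy_mm * math.cos(math.radians(cam_yaw))
            # In the real setup, pose updates like these would drive the Godot camera.
            print(f"pose: x={cam_x:.1f}mm z={cam_z:.1f}mm yaw={cam_yaw:.1f}deg")

if __name__ == "__main__":
    main()
```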

The virtual scenes themselves seem to vary. One is a virtual cliff-avoidance scene, which involves a checkerboard landscape with false drop-offs. The researchers’ GitHub page also mentions “looming stimuli” of various sizes and the mice’s physical movement responses to gratings of various spatial wavelengths.
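For context, a looming stimulus is typically a virtual object whose visual angle grows as it “approaches” the viewer, mimicking an incoming threat. The snippet below is a purely illustrative parameterization of that idea; the disc radius, starting distance, and speed are assumed values, not the lab’s settings.

```python
import math

def looming_angle_deg(radius_cm, start_dist_cm, speed_cm_s, t_s):
    """Visual angle (degrees) subtended by a disc approaching the eye at constant speed."""
    dist = max(start_dist_cm - speed_cm_s * t_s, 1e-3)  # clamp so the disc never passes the eye
    return 2.0 * math.degrees(math.atan(radius_cm / dist))

# Example: a 5 cm disc starting 50 cm away and approaching at 25 cm/s.
# The angle grows slowly at first, then expands rapidly just before "impact".
for t in (0.0, 0.5, 1.0, 1.5, 1.9):
    print(f"t={t:.1f}s  visual angle={looming_angle_deg(5, 50, 25, t):.1f} deg")
```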

The GitHub page doesn’t mention any biological sensors, so it’s unlikely Schaffer and Nishimura are measuring cardiovascular or hormonal responses to what sound like fight-or-flight stimuli. A tweet from Matt Isaacson, a postdoc working in the Schaffer-Nishimura Lab, states only that the project relates to “mouse neuroscience/behavior,” alongside a brief video of the VR setup in action. For now, it looks like we’ll have to wait to find out what exactly these tech-savvy mice are getting up to.

from ExtremeTech https://ift.tt/Dvgjy5n
