Bee My Guide: Sensors & Interactivity
The SPN sub-team’s main purpose is to create programs that detect player pose and send and receive data about players’ progress as they move through the exhibit. Throughout all four sections of the Bee My Guide exhibit, SPN’s programs serve as the interface through which all sub-teams interact and complete their respective tasks.
Section 1
In the first section, users control Mr. Bee’s movements in a game scenario, navigating him through attack clouds.
SPN utilizes two webcams to track the user’s body movements in 3D. The player’s body serves as a “human joystick” to guide Mr. Bee’s flight.
SPN has two methods for detecting user pose. The first is a machine learning approach: data is used to train a model beforehand so that, during the exhibit, the program can identify different poses. The second is a geometric approach, in which the program performs calculations on joint positions to identify the pose the user is making in real time. The input to either approach is the set of coordinates of certain key points (major joints or key locations on the body) as identified by MediaPipe Pose (https://ai.google.dev/edge/mediapipe/solutions/vision/pose_landmarker).
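The geometric approach can be sketched roughly as follows, using hypothetical normalized (x, y) key points in the style of MediaPipe Pose landmarks (x grows rightward, y grows downward). The key-point names and the 15-degree threshold are illustrative assumptions, not the team's actual values.

```python
import math

def classify_lean(keypoints):
    """Geometric pose sketch: classify upright / lean-left / lean-right
    from the tilt of the shoulder midpoint relative to the hip midpoint."""
    sx = (keypoints["left_shoulder"][0] + keypoints["right_shoulder"][0]) / 2
    sy = (keypoints["left_shoulder"][1] + keypoints["right_shoulder"][1]) / 2
    hx = (keypoints["left_hip"][0] + keypoints["right_hip"][0]) / 2
    hy = (keypoints["left_hip"][1] + keypoints["right_hip"][1]) / 2
    # Angle of the torso from vertical: 0 degrees means standing upright.
    angle = math.degrees(math.atan2(sx - hx, hy - sy))
    if angle > 15:
        return "lean_right"
    if angle < -15:
        return "lean_left"
    return "upright"
```

A classification like this is what lets the player's body act as a "human joystick": the returned label maps directly onto a steering command for Mr. Bee.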
Section 2
In the second section, the player points in certain directions to direct Mr. Bee on the screen (like a cursor) and cause physical flowers just below the screen to bloom.
As in Section 1, frames from two webcams are processed by MediaPipe Pose to identify key points. A Python program then determines the pointing direction based on those key points.
When the player points to the correct flower, the program sends a message to cause that particular flower to light up and bloom.
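One simple way to turn key points into a pointing decision is to take the angle of the shoulder-to-wrist vector and match it to the nearest flower. The key-point names, flower layout, and angles below are illustrative assumptions, not the exhibit's actual geometry.

```python
import math

# Hypothetical flower directions, in degrees, in image coordinates
# (x grows rightward, y grows downward, so pointing below the screen
# gives angles between 0 and 180).
FLOWER_ANGLES = {"left_flower": 135, "center_flower": 90, "right_flower": 45}

def pointed_flower(shoulder, wrist, targets=FLOWER_ANGLES):
    """Return the flower whose direction is closest to the
    shoulder->wrist pointing angle."""
    angle = math.degrees(math.atan2(wrist[1] - shoulder[1],
                                    wrist[0] - shoulder[0]))
    return min(targets, key=lambda name: abs(angle - targets[name]))
```

The selected flower name would then be sent in the "bloom" message to the flower hardware.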
Section 3
In the third section, Mr. Bee flies into a mushroom forest where the user attempts to match the pose of a white figure.
SPN uses MediaPipe Pose to process a single webcam feed and identify the player’s key points. These key points are compared to the target pose; if they are close enough, Mr. Bee makes it through the opening.
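The comparison could look something like the sketch below: a mean Euclidean distance between corresponding key points, accepted if it falls under a tolerance. The tolerance value is an illustrative assumption, and a real system would also normalize for the player's position and body scale before comparing.

```python
import math

def pose_matches(player, target, tol=0.1):
    """True when the player's key points are, on average, within
    `tol` (in normalized image units) of the target pose's key points."""
    dists = [math.dist(player[name], target[name]) for name in target]
    return sum(dists) / len(dists) < tol
```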
Section 4
In each of the previous sections, the player is given a game code. In Section 4, the player must point to honeycomb cells that match the game code numbers.
Similar to Section 2, SPN uses two webcams and pointing recognition. If the player selects all the correct cells, they win and Mr. Bee is reunited with his family.
All sections
All sections share a database that keeps track of each player’s game status during the exhibit. The database also generates the randomized game codes issued in Sections 1 through 3 for use in Section 4.
All of the sections use LIDAR sensors (which use infrared lasers to determine the range of every object in their field of view) to detect whether a user is too close to the exhibit. The sensors also collect occupancy data to track general player progress.
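Both checks reduce to simple tests over a scan, here assumed to be a list of per-beam range readings in meters, with non-positive values meaning "no return". The thresholds are illustrative assumptions.

```python
def too_close(scan_ranges, threshold_m=0.5):
    """Safety check: True when any valid LIDAR return is nearer
    than the threshold."""
    return any(0 < r < threshold_m for r in scan_ranges)

def occupied(scan_ranges, max_range_m=3.0):
    """Rough occupancy signal: True when something is within the
    section's play area."""
    return any(0 < r < max_range_m for r in scan_ranges)
```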
SPN is also in charge of the ticketing system, in which users press a button to get a printed barcode. Users then activate each section by scanning that ticket. The barcode serves as a random seed, giving each ticket holder a different experience.
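The barcode-as-seed idea can be sketched by seeding a dedicated random number generator from the barcode string: every scan of the same ticket reproduces the same choices, while different tickets diverge. The function names and variant count are illustrative assumptions.

```python
import random

def experience_rng(barcode):
    """Per-ticket RNG: the same barcode always yields the same sequence."""
    return random.Random(barcode)

def pick_variant(barcode, num_variants=4):
    """Example use: deterministically choose one of several
    exhibit variations for this ticket."""
    return experience_rng(barcode).randrange(num_variants)
```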
Lastly, SPN is in charge of the Ethernet connection that transports information across the different parts of the exhibit. The Bee My Guide exhibit uses both the TCP/IP and UDP protocols for communication between exhibit devices.
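A UDP event message, such as the "bloom" command from Section 2, might look like the sketch below. The host, port, and JSON message format are assumptions for illustration, not the exhibit's actual protocol; TCP would be used instead where delivery must be guaranteed, such as game-status updates.

```python
import json
import socket

def send_event(event, host="127.0.0.1", port=5005, **fields):
    """Fire-and-forget a small JSON event datagram to another
    exhibit device over UDP."""
    payload = json.dumps(dict(event=event, **fields)).encode("utf-8")
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(payload, (host, port))
```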