CEDEC 2023, a large-scale conference for game developers, was held both on-site at Pacifico Yokohama North and online. Here we report on the session “FINAL FANTASY…”.
The session was presented by Itsuto Sato, an animator in Square Enix’s Third Development Business Division, and technical artist Eiji Takada.
Mr. Itsuto Sato on the left, Mr. Eiji Takada on the right
Can motion capture be used for remote work?
Since the start of the coronavirus pandemic in 2020, we have had to set up a remote work environment for motion capture. Normally, humanoid characters are recorded in a large motion capture studio: professional actors wear dedicated capture suits and deliver powerful performances for the game, which are captured with multiple cameras and an optical motion capture system.
To take advantage of what remote work uniquely offers, we began building an environment in which team members who do not live near Tokyo, where the studio is located, can record and direct. This session covered our attempt at a new recording method that works in a remote environment as well as alongside the motion capture studio.
Here we show the motion capture in action. The camera angle was not ideal because we were filming in a small space, and to keep noise down in the apartment, we laid out cushions and a futon while shooting the sword-sheathing animation.
There are various ways to build a motion capture system at home, but this time we used two: “Perception Neuron Studio”, which captures motion with inertial (acceleration) sensors, and “MediaPipe”, which creates capture data from video. Unfortunately, the full-body-tracking “mocopi”, which attaches six sensors to the body, was released in early 2023, when development was almost finished, so it had not been verified yet.
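Before going into each tool, here is a rough sketch of what the video-based MediaPipe route can look like: it runs MediaPipe Pose over a recorded clip and writes out per-frame 3D landmarks. The file names and the CSV output are illustrative assumptions, not the team’s actual pipeline.

```python
# Minimal sketch (assumed setup): extract 3D pose landmarks from a recorded
# clip with MediaPipe Pose. File names and the CSV format are hypothetical.
import csv
import cv2
import mediapipe as mp

VIDEO_PATH = "sheathing_take01.mp4"            # hypothetical home-recorded take
OUT_PATH = "sheathing_take01_landmarks.csv"    # hypothetical output file

mp_pose = mp.solutions.pose

with mp_pose.Pose(model_complexity=2, smooth_landmarks=True) as pose, \
        open(OUT_PATH, "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["frame", "landmark", "x", "y", "z", "visibility"])

    cap = cv2.VideoCapture(VIDEO_PATH)
    frame_idx = 0
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV decodes frames as BGR.
        result = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if result.pose_world_landmarks:
            # World landmarks are in metres with the origin at the hips,
            # which is more convenient for driving a skeleton than the
            # normalized image-space landmarks.
            for i, lm in enumerate(result.pose_world_landmarks.landmark):
                writer.writerow([frame_idx, i, lm.x, lm.y, lm.z, lm.visibility])
        frame_idx += 1
    cap.release()
```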
“Perception Neuron Studio” performs motion capture with sensors that each combine a gyroscope, an accelerometer, and a magnetometer. The device was chosen because it can be set up by one person.
The sensors are rechargeable, so there are no cords to get tangled, but putting them all on takes about six and a half minutes because there are so many of them. Calibration is completed by performing four poses: a standing pose, a T-pose, a hands-together pose, and an object-pinching pose. Setup is done in Axis Studio.
The captured data is fed into the rig in four steps: the animation is output from Axis Studio, which receives the capture results → retargeted with Maya HumanIK → passed through the Maya HumanIK rig input tool (with a 3D model used for confirmation) → transferred to the in-house rig system.
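As a sketch of the first half of this flow, the snippet below batch-imports the FBX takes exported from Axis Studio into Maya so they can then be retargeted with HumanIK. The export folder, the namespacing scheme, and the step of opening the Character Controls window are assumptions for illustration, not the team’s in-house tooling.

```python
# Minimal sketch (assumed workflow, not the team's actual tool): batch-import
# Axis Studio FBX exports into Maya, ready for HumanIK retargeting.
import os
import maya.cmds as cmds
import maya.mel as mel

EXPORT_DIR = "D:/mocap/axis_studio_exports"  # hypothetical export folder


def import_axis_studio_takes(export_dir=EXPORT_DIR):
    """Import every FBX take exported from Axis Studio into the current scene."""
    if not cmds.pluginInfo("fbxmaya", query=True, loaded=True):
        cmds.loadPlugin("fbxmaya")  # FBX importer that ships with Maya

    for name in sorted(os.listdir(export_dir)):
        if not name.lower().endswith(".fbx"):
            continue
        path = os.path.join(export_dir, name).replace("\\", "/")
        # Namespace each take so several captures can coexist in one scene.
        take = os.path.splitext(name)[0]
        cmds.file(path, i=True, type="FBX", namespace=take, ignoreVersion=True)
        print("Imported take:", take)

    # Retargeting onto the game rig is then done with Maya HumanIK
    # (Character Controls) and the in-house rig input tool.
    mel.eval("HIKCharacterControlsTool;")


import_axis_studio_takes()
```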
Here he showed off the sword-sheathing motion. From the temporary model to the model used in the game, the animation carried over accurately. It also proved a great success in a remote environment, because the situation could be shared and displayed on screen. He said it was a practical and fun method that gave him peace of mind: the shooting status could be checked in real time during the meeting, feedback could be given on the spot, the captured data could be shared immediately afterwards, and retakes could be done right away.
It is also possible to capture only the hands, but that was not used this time. He said he would like to try capturing individual body parts in the future, as he expects close-ups of hands in cutscenes to become necessary.
“MediaPipe”, which can capture from video – “mocopi”, which was still in the testing stage