## Introduction
To understand why Myo exists and what problems it solves, see Why MyoSapiens?.
## Key Concepts
### Assets
Assets are files exchanged with the platform. You upload assets (like C3D files and markersets) and the platform creates output assets (like joint angles) when jobs complete.
Each asset has:
- A unique ID for referencing it in API calls
- A purpose indicating what type of file it is
- A status showing whether it's ready to use
- Metadata like filename, size, and checksum
The SDK automatically detects the purpose from file extensions. See the File Formats guide for supported formats and purposes.
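The extension-to-purpose detection described above can be pictured with a short sketch. The purpose names, extensions, and function below are illustrative assumptions, not the SDK's actual API — the authoritative mapping lives in the File Formats guide:

```python
from pathlib import Path

# Hypothetical extension-to-purpose table; the real SDK's mapping
# is documented in the File Formats guide.
PURPOSE_BY_EXTENSION = {
    ".c3d": "tracker",    # motion capture trial
    ".xml": "markerset",  # marker definitions
}

def detect_purpose(filename: str) -> str:
    """Guess an asset's purpose from its file extension (case-insensitive)."""
    ext = Path(filename).suffix.lower()
    if ext not in PURPOSE_BY_EXTENSION:
        raise ValueError(f"Unsupported file extension: {ext!r}")
    return PURPOSE_BY_EXTENSION[ext]

print(detect_purpose("trial_01.C3D"))  # tracker
```

Because detection is extension-based, a file with the wrong extension would be assigned the wrong purpose — the SDK typically lets you override the purpose explicitly in that case.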
### Jobs
Jobs are processing tasks that run on our cloud infrastructure. When you want to retarget motion data to a character, you create a retarget job.
Job types:
- retarget – Map motion capture data onto a character skeleton
Job statuses:
- QUEUED – Waiting to be processed
- RUNNING – Currently being processed
- SUCCEEDED – Completed successfully with output
- FAILED – Something went wrong
- CANCELED – Job was canceled
The SDK provides a simple retarget() method that uploads, runs, and waits in one call, or create_retarget_job() for more control. See the Retargeting Tutorial for a complete example.
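The status lifecycle above — a job moves from QUEUED through RUNNING to one of three terminal states — can be modeled with a small self-contained sketch. The FakeJob and wait_for helper here are illustrative stand-ins, not the SDK's actual classes; retarget() performs this polling for you:

```python
from enum import Enum

class JobStatus(Enum):
    QUEUED = "QUEUED"
    RUNNING = "RUNNING"
    SUCCEEDED = "SUCCEEDED"
    FAILED = "FAILED"
    CANCELED = "CANCELED"

# A job is finished once it reaches any of these states.
TERMINAL = {JobStatus.SUCCEEDED, JobStatus.FAILED, JobStatus.CANCELED}

class FakeJob:
    """Stand-in for a platform job that advances one state per poll."""
    def __init__(self):
        self._states = iter([JobStatus.QUEUED, JobStatus.RUNNING, JobStatus.SUCCEEDED])
        self.status = next(self._states)

    def refresh(self) -> JobStatus:
        if self.status not in TERMINAL:
            self.status = next(self._states)
        return self.status

def wait_for(job) -> JobStatus:
    """Poll until the job reaches a terminal status."""
    while job.status not in TERMINAL:
        job.refresh()
    return job.status

print(wait_for(FakeJob()).value)  # SUCCEEDED
```

A real polling loop would sleep between refreshes and enforce a timeout; the one-call retarget() hides these details, while create_retarget_job() leaves the polling to you.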
Retarget job parameters:
- Required: tracker (C3D file path or asset) and markerset (XML file path or asset)
- Optional: character selection, scaling options, subject measurements, export_glb, stream_status
See the Retarget Parameters guide for complete parameter documentation.
### Output Format
Retarget jobs produce multiple outputs:
- qpos (.parquet) – Joint angles in configuration space
- xpos (.parquet) – Joint positions in 3D space
- model (.mjb) – MuJoCo model file
- motion (.glb, optional) – Motion as GLB when export_glb=True
Use result.download_all("out/") to fetch every output, or result.download_qpos(), result.download_xpos(), etc. to fetch them individually. See the File Formats guide for complete details.
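The download flow can be sketched with a stand-in result object. FakeRetargetResult below only mimics the shape of the API described above (it writes empty placeholder files instead of downloading); the real result class and its method signatures may differ:

```python
import tempfile
from pathlib import Path

class FakeRetargetResult:
    """Illustrative stand-in for a retarget job's result object."""
    OUTPUTS = ("qpos.parquet", "xpos.parquet", "model.mjb")

    def download_all(self, out_dir: str) -> list[Path]:
        """Write one placeholder file per output and return the paths."""
        out = Path(out_dir)
        out.mkdir(parents=True, exist_ok=True)
        paths = []
        for name in self.OUTPUTS:
            path = out / name
            path.write_bytes(b"")  # a real client would stream the file here
            paths.append(path)
        return paths

with tempfile.TemporaryDirectory() as tmp:
    files = FakeRetargetResult().download_all(tmp)
    print([p.name for p in files])  # ['qpos.parquet', 'xpos.parquet', 'model.mjb']
```

The .parquet outputs are ordinary columnar files, so once downloaded they can be loaded with any parquet reader (e.g. pandas or polars).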
## Next Steps
- Install the SDK to start using MyoSapiens
- Set up your account at dev.myolab.ai
- Follow the Retargeting Tutorial for a complete walkthrough
- Learn more about Why Myo? and our approach to motion processing