Frequently Asked Questions
  • Which applications does your solution support?
    Physiotherapy and rehabilitation. Since our acquisition by Nautilus Inc. in September 2021, VAY has shifted its focus away from supporting external fitness and performance companies.
  • What data is VAY tracking?
    We track speed and velocity, reaction speed, joint angles, range of motion, repetition counting, and mistake detection, using up to 30 data points on and around the body. This allows VAY to measure all the data points that conventional trackers and high-profile camera systems could, just faster, cheaper, and more scalably.
  • Is it Open Source?
    It is not. VAY’s motion analysis is proprietary software that can be used only via licensing.
  • What does a client need to provide for a potential cooperation? How much work is involved?
    Integration is straightforward (1-2 hours). The VAY Motion Analysis Kits can be integrated directly into any application and are well documented. It is important to mention that we do not offer GUIs or audio-visual feedback systems, as these relate directly to user experience, and we know that you know your users best. However, we like to learn as much as possible about your product so we can support you with integration and with building a good user experience. Please note that we do not provide our own white-label application. A minimal client-side sketch is shown below.
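    Purely as an illustration, a client-side integration with a cloud-based analysis service could look like the following sketch; the endpoint URL, credential, field names, and response schema are assumptions for illustration, not VAY's actual API.

        # Hypothetical client-side integration sketch (Python).
        # Endpoint, credential, and response schema are illustrative assumptions.
        import cv2
        import requests

        API_URL = "https://api.example.com/v1/analyze"  # hypothetical endpoint
        API_KEY = "your-license-key"                    # hypothetical credential

        def analyze_frame(frame) -> dict:
            """Send a single BGR frame to the (hypothetical) analysis endpoint."""
            ok, jpeg = cv2.imencode(".jpg", frame)
            if not ok:
                raise ValueError("frame could not be JPEG-encoded")
            response = requests.post(
                API_URL,
                headers={"Authorization": f"Bearer {API_KEY}"},
                files={"frame": ("frame.jpg", jpeg.tobytes(), "image/jpeg")},
                timeout=1.0,  # fail fast rather than stall a real-time loop
            )
            response.raise_for_status()
            return response.json()  # keypoints, metrics, grading, ...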
  • What are end user requirements?
    A camera, a processing unit, and an internet connection are all you need. Our solution is hardware-agnostic, so any camera system out there will work.
  • Do I need an expensive 3D camera or a camera system for the motion analysis to work?
    No, any RGB camera will work, allowing you to provide high-end motion analysis without spending big on expensive camera hardware.
  • Is there a mobile app?
    VAY does not operate in the B2C space and as such does not have its own dedicated app. We operate on a B2B SaaS model and integrate into our clients' apps.
  • What platforms does the VAY API support? (Windows/Mac and Android/iOS)
    We support all platforms. We currently have iOS, Android, desktop (Windows/Mac/Linux), and web apps. If desired, we can provide additional APIs. We also always support you with integration and share our expertise for the setup. A camera is required for the pose estimation to work.
  • How precise is the VAY motion analysis algorithm?
    Our human pose estimation already excels on benchmark data sets and is specifically trained for fitness exercises. We include prior knowledge about human anatomy and ensure temporal consistency. Any remaining inconsistencies are compensated for by our motion analysis neural networks, which are trained specifically per exercise, including on imperfect human pose models during each exercise. Our computer vision algorithms are designed to resemble human vision, so precision is reduced in very dark environments.
  • Is the VAY motion analysis solution cloud-based or on-device?
    We currently offer only a cloud-based solution, so users require an internet connection.
  • Does the data provided correspond to a 2D or a 3D system?
    Our current API provides 2D output. Nonetheless, our algorithms are trained to precisely estimate even body parts that are further back in the image or occluded. We are actively working on 3D analysis and incorporate prior knowledge of human anatomy to ensure high precision. Our latest networks show very good accuracy, and we expect to release full 3D real-time analysis in early 2022.
  • Can the VAY tech do a real-time analysis? At what FPS and resolution is it supported?
    Yes. At up to 30 fps, our system provides real-time (<0.1 s latency) analysis output; at 30 fps each frame spans about 33 ms, so feedback within 0.1 s arrives within roughly three frames. Higher frame rates can still be analyzed, but not in real time. Our neural networks work on low-resolution images, so plain VGA (640x480 px) is sufficient. A minimal capture sketch follows below.
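    As a minimal sketch of a capture loop under these constraints, assuming an OpenCV-compatible webcam; analyze_frame() is the hypothetical client from the integration sketch above, not a real VAY call.

        # Capture at VGA/30 fps, matching the resolution and frame rate above.
        import cv2

        cap = cv2.VideoCapture(0)                  # any RGB webcam will do
        cap.set(cv2.CAP_PROP_FRAME_WIDTH, 640)     # VGA width
        cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 480)    # VGA height
        cap.set(cv2.CAP_PROP_FPS, 30)              # real-time budget

        while True:
            ok, frame = cap.read()
            if not ok:
                break
            frame = cv2.resize(frame, (640, 480))  # enforce VGA if the camera ignores the hint
            result = analyze_frame(frame)          # hypothetical client from the sketch above
        cap.release()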
  • In what programming languages is your product available for integration?
    Essentially any language can be supported. JavaScript, C#, Python, Java, and Swift are readily available.
  • What data do we get after the VAY motion analysis? Do we get coordinates of tracked points or images already marked with tracked points? Do we get tracked joint angles? What other data can we get?
    Our product offers information on several levels: (1) the computer model of the human body, i.e., the coordinates of all body parts; (2) specifically requested metrics, such as joint angles, angular velocities, distances between two joints, or velocities of joints; (3) an in-depth comparison to a perfect execution at each point in time; and (4) high-level analytics, i.e., repetition counting and grading, including a list of mistakes and repetition duration. An illustrative result shape covering these four levels is sketched below.
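    As an illustrative example of how such a per-frame result could be shaped; all field names and values are assumptions, not VAY's actual schema:

        # Illustrative per-frame analysis result (field names are assumptions).
        example_result = {
            # 1. Body model: pixel coordinates of tracked body parts
            "keypoints": {
                "left_hip": (318, 296),
                "left_knee": (312, 405),
                "left_ankle": (309, 512),
            },
            # 2. Requested metrics, e.g. joint angles and velocities
            "metrics": {
                "left_knee_angle_deg": 178.6,
                "left_knee_angular_velocity_deg_s": -41.0,
            },
            # 3. Per-frame deviation from a perfect execution
            "form_deviation": {"left_knee_angle_deg": -4.2},
            # 4. High-level analytics: repetition counting and grading
            "repetition": {
                "count": 7,
                "duration_s": 2.8,
                "grade": 0.86,
                "mistakes": ["knees_caving_in"],
            },
        }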
  • How is the proper execution of an exercise defined?
    Our internal experts combine their knowledge of movement with research in movement science to define the proper movement form for an exercise. Key movement parameters are defined, tracked, and integrated into the live feedback, giving users instant feedback on their form while executing a movement. Each exercise is constantly being refined to improve precision and movement tracking. A sketch of one such parameter follows below.
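    As a minimal sketch of how a key movement parameter such as a joint angle can be derived from 2D keypoints; this is generic geometry for illustration, not VAY's implementation:

        # Angle at joint b (e.g. the knee for hip-knee-ankle), in degrees.
        import numpy as np

        def joint_angle_deg(a, b, c) -> float:
            a, b, c = map(np.asarray, (a, b, c))
            v1, v2 = a - b, c - b
            cos_angle = v1 @ v2 / (np.linalg.norm(v1) * np.linalg.norm(v2))
            return float(np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0))))

        # Example: a near-straight leg gives an angle close to 180 degrees.
        print(joint_angle_deg((318, 296), (312, 405), (309, 512)))  # ~178.6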
  • Can I define my own exercises when using the VAY motion analysis algorithm?
    If you have your own experts and want to adapt to their form preferences, we can adapt our movement definitions. Furthermore, we are developing a new platform that allows you as our client to independently add new exercises or adapt existing ones to your liking. We assist in the development process and help to fine-tune.
  • Can I film myself doing a “perfect” form and have that be the benchmark?
    Not yet. We are constantly evolving our tools and plan to let clients perform an exercise and have the algorithm take that movement as the benchmark in the future.
  • How many exercises are there in the VAY motion analysis?
    The VAY exercise library is dynamic and constantly growing. We currently have over 150 exercises defined, and the number is increasing quickly.
  • How much does it cost to use the VAY motion analysis technology?
    Pricing is specific to each product and your needs. We provide a tailored offering that we collaboratively define with you as our client.
  • Are there trial packages?
    We offer a non-commercial trial license for 3 months which includes our full support for integration and the development of new exercises. Contact us for pricing details.
  • When was VAY founded? How did it get where it is today?
    Joel started VAY in 2018 with the goal of creating a fully virtual personal coach. The VAY Fitness Coach launched on the App and Play Store in June 2019 as the first end-consumer product using monocular human motion analysis. Shortly after, we realized that a split focus on this novel computer vision technology, movement science, and user experience/content/community building was not feasible. Our team decided to put all focus on combining computer vision and human biomechanics into democratized professional motion analysis. With our novel way of teaching a computer to understand human motion and movements, we are now building up a scalable movement library that our business clients can use in a plug-and-play manner. In September 2021, VAY was acquired by Nautilus Inc., a public fitness company based in Vancouver, Washington, USA. Our technology will help digitally transform Nautilus and deliver a high level of personalization to their product portfolio. Coming full circle, VAY will help provide real-time feedback, repetition counting, form tracking, and customized digital coaching.
  • How does VAY compare to Apple's ARKit, with the exception of multi-device compatibility?
    Our system shows a higher level of accuracy, speed, and reliability, as it tackles a different use case than ARKit (motion analysis with VAY vs. AR features such as overlays with ARKit). Further, as mentioned in the question, our system is hardware-agnostic.
  • How would the latency impact the accuracy of the response time? How does the cloud affect latency? Can the latency be measured in milliseconds?
    The accuracy is not impacted by the latency, as the system is still able to make use of all the information. For visualizations, we offer predictive algorithms so that the visualization is actually real-time and completely smooth. For precise analysis, we use several frames to suppress outliers and present the feedback with a delay between 0.05 and 0.2 s (our tests have shown that 0.2 s does not affect the human perception of a real-time feedback system). The cloud system has a latency of around 0.05-0.1 s. Latency can be measured in milliseconds directly on the system if required. A sketch of such multi-frame outlier suppression follows below.
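    As a generic sketch of what multi-frame outlier suppression could look like, here a sliding median over recent frames; this is a standard technique used for illustration, not VAY's actual algorithm:

        # Sliding-median smoothing of keypoints over the last few frames.
        from collections import deque

        import numpy as np

        class KeypointSmoother:
            """A window of 5 at 30 fps delays output by ~2 frames (~0.07 s),
            within the 0.05-0.2 s feedback budget described above."""

            def __init__(self, window: int = 5):
                self.history = deque(maxlen=window)

            def update(self, keypoints: np.ndarray) -> np.ndarray:
                # keypoints: array of shape (num_joints, 2)
                self.history.append(keypoints)
                return np.median(np.stack(list(self.history)), axis=0)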
  • Can we select (or have users select) which parts of the body to be identified as the "response target"? Can we set thresholds of movement initiation (translation, flexion, etc.) to qualify as a response?
    Yes, the response target is freely definable, and thresholds can be set as well. The movements in our library are predefined, but you are free to define new ones and set custom thresholds, as sketched below.
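    As an illustrative example of how such a configuration could be expressed; the keys, units, and values are assumptions, not VAY's actual configuration format:

        # Hypothetical response-target configuration with initiation thresholds.
        response_config = {
            "target_joints": ["right_wrist"],  # body parts watched as the response target
            "initiation": {
                "translation_px": 40,          # minimum displacement to qualify as a response
                "flexion_deg": 15,             # minimum elbow flexion to qualify
            },
            "response_window_s": 1.5,          # maximum time after the stimulus to respond
        }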
  • How can we train this technology to "talk to" the stimuli presented visually to register a "correct/make" or "incorrect/miss" response?
    The grading is generally up to you, but we would provide full support with the implementation. "Correct/make" is more about whether the response to the stimulus is executed correctly, while "incorrect/miss" is more about the timing. A toy grading rule is sketched below.
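    As a toy sketch of how such a grading rule could combine form correctness and timing; the rule and window value are assumptions for illustration:

        def grade_response(form_correct: bool, reaction_time_s: float,
                           window_s: float = 1.5) -> str:
            """'Make' requires correct form within the response window;
            a late response counts as a 'miss' regardless of form."""
            if reaction_time_s > window_s:
                return "incorrect/miss"  # timing failed
            return "correct/make" if form_correct else "incorrect/miss"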
  • How accurate is the response time of the technology? If a user's body initiates one direction, but then controls their posture to go the other direction, which response is recorded?
    Both reactions can and will be recorded. The grading and feedback are up to you; we can provide whatever is required.
  • Could average response time and average "return time" (time to return to a pre-determined starting position) be presented across a "set"/end of the training session?
    Yes, this is no problem. All data would be available.
  • Can the camera-registered feedback mechanisms be a complementing factor in our system, alongside things such as audio feedback and voice-detected responses?
    Yes.
