
IBM / Watson-Unity-ARKit

License: Apache-2.0

Programming Languages

C#

Projects that are alternatives of or similar to Watson-Unity-ARKit

watson-discovery-sdu-with-assistant
Build a Node.js chatbot that uses Watson services and webhooks to query an owner's manual
Stars: ✭ 20 (-16.67%)
Mutual labels:  ibm-cloud, ibmcode, watson-assistant
watson-waste-sorter
Create an iOS phone application that sorts waste into three categories (landfill, recycling, compost) using a Watson Visual Recognition custom classifier
Stars: ✭ 45 (+87.5%)
Mutual labels:  ibm-cloud, ibmcode
alexa-skill-watson-assistant
Alexa Skill using IBM Watson Assistant and IBM Cloud Functions
Stars: ✭ 72 (+200%)
Mutual labels:  ibmcode, watson-assistant
watson-discovery-ui
Develop a fully featured Node.js web app built on the Watson Discovery Service
Stars: ✭ 63 (+162.5%)
Mutual labels:  ibm-cloud, ibmcode
vr-speech-sandbox-cardboard
WARNING: This repository is no longer maintained ⚠️ This repository will not be updated. The repository will be kept available in read-only mode.
Stars: ✭ 27 (+12.5%)
Mutual labels:  watson-speech, ibmcode
speech-to-text-code-pattern
React app using the Watson Speech to Text service to transform voice audio into written text.
Stars: ✭ 37 (+54.17%)
Mutual labels:  watson-speech, ibm-cloud
dnn-object-detection
Analyze real-time CCTV images with Convolutional Neural Networks
Stars: ✭ 93 (+287.5%)
Mutual labels:  ibm-cloud, ibmcode
gdpr-fingerprint-pii
Use Watson Natural Language Understanding and Watson Knowledge Studio to fingerprint personal data from unstructured documents
Stars: ✭ 49 (+104.17%)
Mutual labels:  ibm-cloud, ibmcode
watson-vehicle-damage-analyzer
A server and mobile app to send pictures of vehicle damage to IBM Watson Visual Recognition for classification
Stars: ✭ 62 (+158.33%)
Mutual labels:  ibm-cloud, ibmcode
fb-watson
Hands-on developing an application using IBM Watson services with Facebook Messenger integrated through serverless functions
Stars: ✭ 19 (-20.83%)
Mutual labels:  ibm-cloud, watson-assistant
watson-discovery-food-reviews
Combine Watson Knowledge Studio and Watson Discovery to discover customer sentiment from product reviews
Stars: ✭ 36 (+50%)
Mutual labels:  ibm-cloud, ibmcode
watson-speech-translator
Use Watson Speech to Text, Language Translator, and Text to Speech in a web app with React components
Stars: ✭ 66 (+175%)
Mutual labels:  watson-speech, ibm-cloud
watson-multimedia-analyzer
WARNING: This repository is no longer maintained ⚠️ This repository will not be updated. The repository will be kept available in read-only mode. A Node app that use Watson Visual Recognition, Speech to Text, Natural Language Understanding, and Tone Analyzer to enrich media files.
Stars: ✭ 23 (-4.17%)
Mutual labels:  watson-speech, ibmcode
web-voice-processor
A library for real-time voice processing in web browsers
Stars: ✭ 69 (+187.5%)
Mutual labels:  voice-commands
awesome-ibmcloud
A curated list of awesome IBM Cloud SDKs, open source repositories, tools, blogs and other resources.
Stars: ✭ 77 (+220.83%)
Mutual labels:  ibm-cloud
lyrebird-slack-integration
Send voicified messages on Slack using your vocal avatar!
Stars: ✭ 31 (+29.17%)
Mutual labels:  voice-commands
nodejs-microservice
WARNING: This repository is no longer maintained ⚠️ This repository will not be updated.
Stars: ✭ 18 (-25%)
Mutual labels:  ibm-cloud
jpetstore-kubernetes
Modernize and Extend: JPetStore on IBM Cloud Kubernetes Service
Stars: ✭ 21 (-12.5%)
Mutual labels:  ibm-cloud
voice-command
A simple no-API voice command assistant
Stars: ✭ 52 (+116.67%)
Mutual labels:  voice-commands
agile-tutorial
A tutorial for agile development of cloud applications.
Stars: ✭ 16 (-33.33%)
Mutual labels:  ibm-cloud

WARNING: This repository is no longer maintained ⚠️

This repository will not be updated. The repository will be kept available in read-only mode.

Build an AI Powered AR Character in Unity with AR Foundation

This pattern was originally published using ARKit and only available on iOS. With Unity introducing AR Foundation, this pattern can now run on either ARKit or ARCore depending on what device you build for.

In this Code Pattern we will use Watson Assistant, Speech to Text, and Text to Speech, deployed to an iPhone or an Android phone using ARKit or ARCore respectively, to drive a voice-powered animated avatar in Unity.

Augmented reality lowers the barrier to entry for both developers and end users thanks to framework support in phones and digital eyewear. Unity's AR Foundation lowers that barrier further for developers, allowing a single Unity project codebase to take advantage of both ARKit and ARCore.

For more information about AR Foundation, take a look at Unity's blog.

When the reader has completed this Code Pattern, they will understand how to:

  • Add IBM Watson Speech-to-Text, Assistant, and Text-to-Speech to Unity with AR Foundation to create an augmented reality experience.

"diagram"

Flow

  1. User interacts in augmented reality and gives voice commands such as "Walk Forward".
  2. The phone microphone picks up the voice command and the running application sends it to Watson Speech-to-Text.
  3. Watson Speech-to-Text converts the audio to text and returns it to the running application on the phone.
  4. The application sends the text to Watson Assistant. Watson Assistant returns the recognized intent "Forward". The intent triggers an animation state event change (see the sketch after this list).
  5. The application sends the response from Watson Assistant to Watson Text-to-Speech.
  6. Watson Text-to-Speech converts the text to audio and returns it to the running application on the phone.
  7. The application plays the audio response and waits for the next voice command.
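As a rough illustration of steps 3 through 7 (not the pattern's actual code), the hypothetical C# sketch below shows how a final transcript from Speech to Text might be routed to the Assistant, mapped to an Animator trigger, and answered through Text to Speech. The helper methods SendToAssistant, GetIntent, and Speak are placeholders for the Watson SDK calls made by the project's own scripts; only the Animator call is real Unity API.

using UnityEngine;

// Hypothetical sketch of the voice-command loop described in the Flow section.
// SendToAssistant, GetIntent, and Speak are placeholders, not Watson SDK calls.
public class VoiceCommandLoopSketch : MonoBehaviour
{
    [SerializeField] private Animator avatarAnimator;

    // Called when Speech to Text returns a final transcript, e.g. "Walk Forward".
    public void OnTranscriptReceived(string transcript)
    {
        // Step 4: send the text to Watson Assistant and read back the intent.
        string intent = GetIntent(SendToAssistant(transcript)); // e.g. "Forward"

        // The intent drives an animation state change on the avatar.
        if (intent == "Forward")
        {
            avatarAnimator.SetTrigger("Forward");
        }

        // Steps 5-7: hand the Assistant response to Text to Speech and play it.
        Speak("Walking forward.");
    }

    // Placeholder stand-ins for the actual Watson service calls.
    private string SendToAssistant(string text) { return text; }
    private string GetIntent(string assistantResponse)
    {
        return assistantResponse.Contains("Forward") ? "Forward" : string.Empty;
    }
    private void Speak(string text) { /* Text to Speech playback would go here */ }
}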

Included components

  • Watson Assistant: conversational AI service used to recognize intents (such as "Forward") from the transcribed voice commands.
  • Watson Speech to Text: converts voice audio into written text.
  • Watson Text to Speech: converts written text into spoken audio.

Featured technologies

  • Unity: A cross-platform game engine used to develop video games for PC, consoles, mobile devices and websites.
  • AR Foundation: A Unity package that provides AR functionality for creating augmented reality experiences (a brief usage sketch follows this list).
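To give a flavor of the AR Foundation API, here is a generic, hypothetical tap-to-place sketch. It assumes AR Foundation 2.x or later (which provides ARRaycastManager) and is not necessarily how this project positions its avatar; the prefab and manager references are illustrative.

using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Hypothetical tap-to-place example, assuming AR Foundation 2.x or later.
public class TapToPlaceSketch : MonoBehaviour
{
    [SerializeField] private ARRaycastManager raycastManager;
    [SerializeField] private GameObject avatarPrefab;

    private static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void Update()
    {
        if (Input.touchCount == 0) return;

        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began) return;

        // Raycast from the touch point against detected planes and place the avatar at the hit pose.
        if (raycastManager.Raycast(touch.position, hits, TrackableType.PlaneWithinPolygon))
        {
            Pose hitPose = hits[0].pose;
            Instantiate(avatarPrefab, hitPose.position, hitPose.rotation);
        }
    }
}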

Steps

  1. Before you begin
  2. Create IBM Cloud services
  3. Building and Running

1. Before you begin

To follow this pattern you will need an IBM Cloud account and Unity installed locally; building for a device additionally requires Xcode (for iOS) or Android tooling such as Android Studio (for Android).

2. Create IBM Cloud services

On your local machine:

  1. git clone https://github.com/IBM/Watson-Unity-ARKit.git
  2. cd Watson-Unity-ARKit

In IBM Cloud:

  1. Create a Speech-To-Text service instance.
  2. Create a Text-to-Speech service instance.
  3. Create an Assistant service instance.
  4. Once you see the services in the Dashboard, select the Assistant service you created and click Launch Tool. "Launch Tool Button"
  5. After logging into the Assistant Tool, click Create a Skill. "Create a Skill Button"
  6. Click the Create skill button. "Create skill button"
  7. Click "Import skill".
  8. Import the Assistant voiceActivatedMotionSimple.json file located in your clone of this repository.
  9. Once the skill has been created, we'll need to add it to an Assistant. If you have opened your skill, back out of it. Click Assistants.
  10. Click Create Assistant.
  11. Name your assistant and click Create assistant.
  12. Click Add dialog skill to add the skill you just imported to this Assistant.
  13. Click the ... menu in the top and click "Settings" to see the Assistant Settings. "Assistant Skills menu"
  14. Click API Details and find your Assistant Id. You will need this in the next section.

3. Building and Running

Note: This has been compiled and tested using Unity 2018.3.0f2 and Watson SDK for Unity 3.1.0 (2019-04-09) & Unity Core SDK 0.2.0 (2019-04-09).

Note: If you are in any IBM Cloud region other than US-South/Dallas you must use Unity 2018.2 or higher. This is because Unity 2018.2 or higher is needed for TLS 1.2, which is the only TLS version available in all regions other than US-South.

The unity-sdk and unity-sdk-core directories within the Assets directory are empty placeholders for where the SDKs should go. Either delete these empty directories or move the contents of the SDKs into them after running the following commands.

  1. Download the Watson SDK for Unity or perform the following:

git clone https://github.com/watson-developer-cloud/unity-sdk.git

Make sure you are on the 3.1.0 tagged branch.

  2. Download the Unity Core SDK or perform the following:

git clone https://github.com/IBM/unity-sdk-core.git

Make sure you are on the 0.2.0 tagged branch.

  3. Open Unity and inside the project launcher select the Open button.
  4. If prompted to upgrade the project to a newer Unity version, do so.
  5. Follow these instructions to add the Watson SDK for Unity downloaded in step 1 to the project.
  6. Follow these instructions to create your Speech To Text, Text to Speech, and Watson Assistant services and find your credentials using IBM Cloud.

Please note: the following instructions reflect scene changes, as game objects have been added or replaced for AR Foundation.

  1. In the Unity Hierarchy view, expand AR Default Plane and click DefaultAvatar. If you are not in the Main scene, click Scenes and then Main in your Project window, then find the game objects listed above.
  2. In the Inspector you will see variables for Speech To Text, Text to Speech, and Assistant. If you are using US-South/Dallas, you can leave the Assistant URL, Speech to Text URL, and Text To Speech URL blank; they will take on the default values shown in the WatsonLogic.cs file. Otherwise, provide the URL values listed on the Manage page for each service in IBM Cloud.
  3. Fill out the Assistant Id, Assistant IAM Apikey, Speech to Text Iam Apikey, and Text to Speech Iam Apikey fields (see the sketch below the screenshot). Each Iam Apikey value is the API key or token listed under the URL on the Manage page for the corresponding service.

"Unity Editor enter credentials"

Building for iOS

Build steps for iOS have been tested with iOS 11+ and Xcode 10.2.1.

  1. To build for iOS and deploy to your phone, go to File -> Build Settings (Ctrl + Shift + B) and click Build (a scripted equivalent is sketched after these steps).
  2. When prompted you can name your build.
  3. When the build is completed, open the project in Xcode by clicking on Unity-iPhone.xcodeproj.
  4. Follow steps to sign your app. Note - you must have an Apple Developer Account.
  5. Connect your phone via USB and select it from the target device list at the top of Xcode. Click the play button to run it.
  6. Alternatively, connect the phone via USB and use File -> Build and Run (Ctrl + B).
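If you prefer scripting the build, the hypothetical editor script below does roughly the same as the Build Settings dialog for iOS. The scene path and output folder are assumptions; adjust them to your project.

#if UNITY_EDITOR
using UnityEditor;

// Hypothetical editor-script equivalent of File -> Build Settings -> Build for iOS.
public static class IosBuildSketch
{
    [MenuItem("Build/iOS (Xcode project)")]
    public static void Build()
    {
        string[] scenes = { "Assets/Scenes/Main.unity" }; // assumed scene path
        // Generates an Xcode project in Builds/iOS; open Unity-iPhone.xcodeproj to sign and deploy.
        BuildPipeline.BuildPlayer(scenes, "Builds/iOS", BuildTarget.iOS, BuildOptions.None);
    }
}
#endif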

Building for Android

Build steps for Android have been tested with Pie on a Pixel 2 device with Android Studio 3.4.1.

  1. To build for Android and deploy to your phone, go to File -> Build Settings (Ctrl + Shift + B) and click Switch Platform (a scripted equivalent is sketched after these steps).
  2. The project will reload in Unity. When done, click Build.
  3. When prompted you can name your build.
  4. When the build is completed, install the APK on your emulator or device.
  5. Open the app to run.
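As with iOS, a hypothetical editor script can switch the platform and produce the APK; the scene path and APK name below are assumptions.

#if UNITY_EDITOR
using UnityEditor;

// Hypothetical editor-script equivalent of Switch Platform followed by Build for Android.
public static class AndroidBuildSketch
{
    [MenuItem("Build/Android (APK)")]
    public static void Build()
    {
        // Equivalent of clicking Switch Platform in the Build Settings dialog.
        EditorUserBuildSettings.SwitchActiveBuildTarget(BuildTargetGroup.Android, BuildTarget.Android);

        string[] scenes = { "Assets/Scenes/Main.unity" }; // assumed scene path
        BuildPipeline.BuildPlayer(scenes, "Builds/watson-ar.apk", BuildTarget.Android, BuildOptions.None);
    }
}
#endif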

Troubleshooting

AR features are only available on iOS 11+ and cannot run on an emulator/simulator. Be sure to check your player settings to target a minimum iOS version of 11, and set your Xcode deployment target (under Deployment Info) to 11 as well.

In order to run the app on a device you will need to sign it (see the signing note in the iOS build steps above).

Mojave updates may adjust security settings and block microphone access in Unity. If Watson Speech to Text appears to be in a ready and listening state but not hearing audio, make sure to check your security settings for microphone permissions. For more information: https://support.apple.com/en-us/HT209175.

You may need the ARCore APK for your Android emulator. This pattern has been tested with ARCore SDK v1.9.0 on a Pixel 2 device running Pie.

Learn more

  • Artificial Intelligence Code Patterns: Enjoyed this Code Pattern? Check out our other AI Code Patterns.
  • AI and Data Code Pattern Playlist: Bookmark our playlist with all of our Code Pattern videos
  • With Watson: Want to take your Watson app to the next level? Looking to utilize Watson Brand assets? Join the With Watson program to leverage exclusive brand, marketing, and tech resources to amplify and accelerate your Watson embedded commercial solution.

License

This code pattern is licensed under the Apache Software License, Version 2. Separate third party code objects invoked within this code pattern are licensed by their respective providers pursuant to their own separate licenses. Contributions are subject to the Developer Certificate of Origin, Version 1.1 (DCO) and the Apache Software License, Version 2.

Apache Software License (ASL) FAQ

Note that the project description data, including the texts, logos, images, and/or trademarks, for each open source project belongs to its rightful owner. If you wish to add or remove any projects, please contact us at [email protected].