Tomáš Šmerda
This application is a prototype developed as part of a diploma thesis in the Open Informatics program at Mendel University. It serves as a proof of concept to assess market viability and user engagement and is not intended for commercial use. The app demonstrates the potential for further development and integration into the metaverse, focusing on collaborative augmented reality experiences.
Discover a cutting-edge iOS app for collaborative augmented reality experiences. Share and manipulate scenes in real-time, leveraging ARKit, RealityKit, and MultipeerConnectivity. Experience the future of AR with seamless usability and potential for further development in the metaverse.
- The MetaCollaboration project is implemented using the Model-View-ViewModel (MVVM) architecture pattern (see the sketch after this list).
- The app targets iOS 16 and is built with SwiftUI 4.0
- iOS 16
- For proper functionality, you need to run the AR Manuals Backend locally and set the correct IP address in the NetworkManager file (shown below)
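As a rough illustration of the MVVM split mentioned above, here is a minimal SwiftUI sketch; the type names (`Guide`, `GuideListViewModel`, `GuideListView`) are illustrative, not the app's actual identifiers:

```swift
import SwiftUI

// Illustrative model type; the real app's domain models may differ.
struct Guide: Identifiable, Decodable {
    let id: String
    let name: String
}

// ViewModel: owns published state and would talk to the network layer.
@MainActor
final class GuideListViewModel: ObservableObject {
    @Published var guides: [Guide] = []

    func fetchGuides() async {
        // In the real app this would call NetworkManager; hardcoded here for brevity.
        guides = [Guide(id: "1", name: "Sample manual")]
    }
}

// View: renders the ViewModel's published state.
struct GuideListView: View {
    @StateObject private var viewModel = GuideListViewModel()

    var body: some View {
        List(viewModel.guides) { guide in
            Text(guide.name)
        }
        .task { await viewModel.fetchGuides() }
    }
}
```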
This prototype is intended as a starting point for further development. Future iterations could include more robust user interfaces, greater integration with metaverse standards, and improvements in object manipulation techniques to enhance user engagement and utility in professional settings.
private let baseURL = "http://192.168.1.13:8080/api/v3"
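If you switch development machines often, one option is to resolve the host at runtime instead of hardcoding it. A minimal sketch, assuming a hypothetical `LOCAL_API_HOST` environment variable set in the Xcode scheme:

```swift
import Foundation

// Sketch: resolve the backend host at runtime instead of hardcoding it.
// LOCAL_API_HOST is a hypothetical environment variable set in the Xcode scheme;
// the fallback mirrors the hardcoded address above.
enum APIConfig {
    static var baseURL: String {
        let host = ProcessInfo.processInfo.environment["LOCAL_API_HOST"] ?? "192.168.1.13"
        return "http://\(host):8080/api/v3"
    }
}
```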
Begin by downloading Apple's sample app for scanning 3D objects with ARKit, available at the following link: Scanning and Detecting 3D Objects.
Use the downloaded sample app to scan the 3D object you wish to include in your manual. Once the object is successfully scanned, export it as an .arobject file.
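For reference, an exported .arobject can later be loaded and used for detection in ARKit roughly like this; a sketch, assuming the file is bundled with the app as `MyTool.arobject` (the filename is illustrative):

```swift
import ARKit

// Sketch: load a scanned .arobject and configure a session to detect it.
// "MyTool.arobject" is an illustrative filename, not one shipped with the app.
func makeDetectionConfiguration() throws -> ARWorldTrackingConfiguration {
    guard let url = Bundle.main.url(forResource: "MyTool", withExtension: "arobject") else {
        throw CocoaError(.fileNoSuchFile)
    }
    let referenceObject = try ARReferenceObject(archiveURL: url)
    let configuration = ARWorldTrackingConfiguration()
    configuration.detectionObjects = [referenceObject]
    return configuration
}
```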
With your .arobject file ready, open the Reality Composer app on an iPad. In the app, select the “Object” type and import your scanned .arobject. Reality Composer allows you to manipulate and annotate your 3D object in a more intuitive way.
Place 3D annotation models around your object at locations that need explanations or highlights. This step involves adding interactive elements to your 3D object, enhancing the instructional value of your manual.
After positioning all annotation models correctly, remove the main .arobject from your project. Then, export the entire project as a USDZ file.
Exporting in USDZ format might require enabling USDZ export in the app's settings on your iPad.
Each USDZ model exported from Reality Composer represents a single step in your AR manual.
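In the app, each step's USDZ can then be loaded as a RealityKit entity and anchored in the scene. A minimal sketch, assuming a bundled file named `Step1.usdz` (the resource name is illustrative):

```swift
import RealityKit

// Sketch: load one step's USDZ model and attach it to an anchor.
// "Step1" is an illustrative resource name.
func loadStepEntity(into arView: ARView) throws {
    let stepEntity = try Entity.load(named: "Step1")
    let anchor = AnchorEntity(world: .zero) // anchored at the world origin for brevity
    anchor.addChild(stepEntity)
    arView.scene.addAnchor(anchor)
}
```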
Next, upload the USDZ models along with the .arobject file to your backend through Swagger UI.
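If you prefer scripting the upload instead of clicking through Swagger UI, a rough URLSession sketch follows. The endpoint path (`/api/v3/models`) and form field name (`file`) are hypothetical placeholders; check the backend's actual routes in Swagger:

```swift
import Foundation

// Sketch: upload one model file as multipart/form-data.
// The endpoint path ("/api/v3/models") and form field name ("file")
// are hypothetical; consult the backend's Swagger UI for the real ones.
func uploadModel(at fileURL: URL, to host: String) async throws {
    let boundary = "Boundary-\(UUID().uuidString)"
    var request = URLRequest(url: URL(string: "http://\(host):8080/api/v3/models")!)
    request.httpMethod = "POST"
    request.setValue("multipart/form-data; boundary=\(boundary)", forHTTPHeaderField: "Content-Type")

    var body = Data()
    body.append("--\(boundary)\r\n".data(using: .utf8)!)
    body.append("Content-Disposition: form-data; name=\"file\"; filename=\"\(fileURL.lastPathComponent)\"\r\n".data(using: .utf8)!)
    body.append("Content-Type: application/octet-stream\r\n\r\n".data(using: .utf8)!)
    body.append(try Data(contentsOf: fileURL))
    body.append("\r\n--\(boundary)--\r\n".data(using: .utf8)!)

    let (_, response) = try await URLSession.shared.upload(for: request, from: body)
    guard (response as? HTTPURLResponse)?.statusCode == 200 else {
        throw URLError(.badServerResponse)
    }
}
```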
To finalize the setup, access your backend through MongoDB Compass and assign the correct model name to each step of your manual. This ensures that the right 3D model is associated with the appropriate instructional step.
Make sure that each step in the manual has its own unique identifier; this maintains order and consistency in the AR manual and ensures that the iOS app works properly.
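For orientation, a step document of this shape could be decoded in the app roughly as follows; the field names (`id`, `title`, `modelName`, `order`) are illustrative guesses, not the backend's confirmed schema:

```swift
import Foundation

// Sketch: a Codable mirror of one manual step as it might be stored in MongoDB.
// All field names here are illustrative, not the backend's confirmed schema.
struct ManualStep: Identifiable, Codable {
    let id: String        // unique identifier that keeps steps ordered and distinct
    let title: String     // short instruction shown to the user
    let modelName: String // name of the USDZ model assigned in MongoDB Compass
    let order: Int        // position of the step within the manual
}

// Decoding a step received from the backend:
let json = #"{"id": "step-1", "title": "Remove the cover", "modelName": "Step1.usdz", "order": 1}"#
let step = try JSONDecoder().decode(ManualStep.self, from: Data(json.utf8))
```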