AR-namils… get it? Ha.
Hi, I’m A.J., the new summer intern at Capitol Interactive. I’m still learning all the tools and software we developers have to know in this industry, but this first project was pretty exciting to take on by myself.
AR-namils is an Augmented Reality zoo of sorts: users tap a button and the corresponding animal appears in front of them in the real world. It’s pretty simple, but it took a lot of learning and research to get right. Luckily, Facebook’s AR Studio has some nifty documentation and tutorials, and I found some inexpensive 3D assets on the Unity Asset Store. Better yet, the assets came with idle animations, which is all a first-time developer needs.
My primary task was creating an AR experience using AR Studio and plane detection, so that’s where I started. First I watched a couple of tutorials on the user interface so I could get used to it. Next, I got the hang of importing the 3D animals with their textures and idle animations, placing them on the plane, and testing on my phone to make sure plane detection was working properly. Unfortunately, some of my favorite animals didn’t have proper idle animations, so I had to use the next best trio: a shark, a penguin, and a rabbit. Unorthodox for sure, but workable.
Next, I added three square buttons of different colors to the bottom of the screen, one for each animal. Then I remembered “what is user interface?” and added the text “Just tap the button, man” to the top of the screen. I also added a small description text box above the buttons.
Then came the hard part: Patch Editor programming. I had no idea where to begin other than to try combinations that seemed like they might work. First, I tried wiring each button press to switch the visibility of its animal and text box on and off. Each button would toggle a specific animal and its text box, and ideally the user would have to toggle one off before tapping another button.
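To see why this first approach falls apart, here is a minimal sketch in plain Python (not Patch Editor patches or AR Studio code; the animal names are just the three from the app) of independent per-button toggles. Nothing stops several animals from being visible at the same time:

```python
# Naive toggle logic: each button flips its own animal's visibility
# independently, with no knowledge of the other animals.
visible = {"shark": False, "penguin": False, "rabbit": False}

def toggle(animal: str) -> None:
    """Flip visibility for one animal; the others are untouched."""
    visible[animal] = not visible[animal]

# A user taps all three buttons without toggling anything off first:
for animal in ("shark", "penguin", "rabbit"):
    toggle(animal)

print(visible)  # all three animals are now visible at once
```

Because each toggle only knows about its own animal, the "one at a time" rule depends entirely on the user behaving, which is exactly what went wrong.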
That ended up looking like this when the user tapped all three buttons:
Not ideal. Little did I know that the Patch Editor has built-in Option Switch and Option Picker nodes. With that knowledge, I experimented with patching different nodes together and finally landed on a setup that allowed fluid switching from one animal to the next whenever its corresponding button was pressed.
By using those two nodes, I could assign each button a value (0–2) and make only the selected animal visible while the others stayed hidden. My app finally worked as intended. However, it was ugly; it needed a small facelift, as everyone I showed it to rightly pointed out. For demos I used the AR Studio Player app, which came in extremely handy with my Google Pixel 2 — thank god for offline use.
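The Option Picker idea boils down to driving all three visibility flags from a single selected index. A sketch of that logic in plain Python (again, not actual Patch Editor nodes; `pick` is a hypothetical helper standing in for the node graph):

```python
# Exclusive selection: one index in, one visibility flag per animal out.
ANIMALS = ("shark", "penguin", "rabbit")

def pick(selected: int) -> dict:
    """Map a button value (0-2) to visibility flags; exactly one is True."""
    return {animal: (i == selected) for i, animal in enumerate(ANIMALS)}

print(pick(1))  # {'shark': False, 'penguin': True, 'rabbit': False}
```

Since every flag is derived from the same index, it is impossible for two animals to be visible at once, which is what the per-button toggles couldn’t guarantee.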
… I replaced the mug object with my own “Penguin_0”, which let the user select any object and use touch gestures: tapping and dragging to move it around the plane, or pinching to resize the animals. This interaction was a nice cherry on top of the whole AR experience and made it feel more natural and easy to use.
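The pinch-to-resize behavior comes down to one ratio. Here is a small sketch of that math in plain Python (a hypothetical `pinch_scale` helper, not the AR Studio gesture API; the clamp limits are made-up values): the object’s scale is multiplied by current finger distance over starting finger distance, clamped so an animal can’t shrink away or fill the screen.

```python
# Assumed clamp range so pinching can't make the object vanish or explode.
MIN_SCALE, MAX_SCALE = 0.25, 4.0

def pinch_scale(base_scale: float, start_dist: float, current_dist: float) -> float:
    """New uniform scale: base scale times the finger-distance ratio, clamped."""
    factor = current_dist / start_dist
    return max(MIN_SCALE, min(MAX_SCALE, base_scale * factor))

print(pinch_scale(1.0, 100.0, 150.0))  # fingers spread 1.5x apart -> scale 1.5
```

Clamping is worth the two extra lines: without it, a big accidental pinch leaves the user hunting for an animal that has scaled down to nearly nothing.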
With the final design in hand, I let a few friends and family members try it out. With the redesigned interface, they found the app easier to use and enjoyed being able to place the animals anywhere they wanted. Knowing me only as an artist and animator, they were surprised I had built it myself.
Overall, it took only a few days of work and critique to get right, and I was surprised by how easy the program made it. The great feedback and ease of use make me excited to build more apps with Facebook’s AR Studio. For my first AR app, I’m proud of it, and I look forward to creating more and learning more about the program. My hope is that the interns, students, and even professionals I’ve reached will give this program a chance for their first (or next) Augmented Reality experience.