
How Audrey Spencer, Maker of Snapchat Lenses, Bends Reality on Your Smartphone « Next Reality

The rapid emergence of Lens Studio as a platform for easily developing augmented reality experiences is just one indicator that immersive computing is becoming the norm.

But a layer above the technology tools are the users, the real stars of each platform. Without passionate users, your platform is really just a great idea without the footprint of real-world usage and momentum. That’s part of the reason we decided to take a closer look at some of the people driving the most common version of AR forward through mobile apps.


The first is Audrey Spencer, a prolific maker of Snapchat lenses (you can follow her on Snapchat here) who started out years ago as a casual user and eventually developed some of the most engaging AR experiences in the app. That experience led Spencer to take her work to the next level by learning Unity.

Image via Audrey Spencer

In addition to her work in smartphone-based AR, Spencer is currently the lead industrial designer at San Francisco-based AR startup Kura Technologies. If you want a peek at one of the most active minds leading the mainstreaming of AR, start here. This is just the first of many such explorations with AR creators.

Next Reality: Where are you from? And what was your first foray into any kind of immersive media, be it VR, AR, mobile AR, whatever?

Audrey Spencer: I’m from the Boston area, about 20 minutes outside the city. I went to school (Massachusetts College of Art and Design) for industrial design, but I also did a lot of video and audio work as well as a lot of multimedia projects. Storytelling is what has driven me throughout my career, be it industrial design or making things for the internet. In 2014 I started creating content on Snapchat. It wasn’t AR or VR, but you know, mobile stuff, using the drawing tools and some other tools to create stories in a different way.

Next Reality: That’s around the same time I started working with Snapchat. Had you played with other kinds of social media?

Spencer: I was very used to drawing on my phone’s screen, which led to drawing in Snapchat. I didn’t really like social media that much. I had a Facebook account, but I didn’t actually post to it. There was just something about Snapchat’s disappearing messages: you could try things out. It was more liberating and less like YouTube, where you have to produce everything.

Next Reality: Do you remember the first time you saw augmented reality on Snapchat?

Spencer: I think it was around 2016. Maybe it was that rainbow Lens. My first reaction was, I wonder what else I could make it track? Some of the filters worked on one of my cats, and that was very exciting. That was my first introduction to it. I always thought they were very neat, but how they were made was a bit of a mystery to me. Like, how does it know what my face is?

Next Reality: How did you finally figure out what they were doing behind the scenes to get the AR to work in terms of tracking and so on, before you even touched Lens Studio?

Spencer: That is a good question. A few of the friends I met while doing illustrations on Snapchat had switched from just creating art and storytelling on Snapchat to making lenses. And they said, “Oh, you’re working in 3D, you should totally do this. It’s totally your thing.” It probably took me two years of just playing around with it to finally get the hang of it.

Next Reality: Talk a little bit more about your experience with Lens Studio. I think you said you first started using it in 2018?

Spencer: I made this video series around 2018. I had this picture of my cat Oscar sitting facing the camera, and he looked like a snowball, as if you couldn’t see his body or legs, just his face and the fur around it. He looked like a sphere. So I made all these videos of him, as if he was floating in different locations. And then another one where he was an ice cream cone, and then a sphere.

So the first thing I did was take the 2D image of Oscar as a sphere and place it on a grounded plane that you could move, so you could, for example, make your own orbs float around your house. That was my first. Lens Studio was intimidating back then; it wasn’t as easy as it is now. I didn’t know the program, but I did have a starting point for understanding it [via experience with Photoshop, Final Cut, and other apps].

Next Reality: Your experience seems unique. Most people have no background in non-linear editing or advanced programs such as Adobe After Effects. Do you think quick, easy AR creation within Lens Studio still requires that kind of mental framework?

Spencer: Now that I’ve started using Unity, I have a great deal of respect for how Lens Studio has taken those same tools and distilled them into something anyone can use. Some of the more complex stuff requires coding or using their tutorials. But at the entry level, if you just want to put a sprite on your forehead, add some particle effects, or apply some makeup to a face, I think it’s a really great tool for getting people to start thinking that way, because the many tutorials walk you through it well, and the pre-built stuff is easy to understand for the most part.

Next Reality: I feel like I’ve seen more music artists using mobile AR and fewer traditional visual artists taking the leap to mobile AR. With regard to Lens Studio specifically, I often compare it to Photoshop for AR, so do you have any idea why we’re not seeing even more traditional artists adopting AR?

Spencer: I think it’s because there’s still a mystique surrounding AR … how it’s made, and not knowing how accessible it is. I think what we’ll probably see is more AR work as people get more involved and have a better understanding of the technology.

Image via Audrey Spencer

Next Reality: How do you think Snapchat stacks up against TikTok’s AR filters and Facebook’s Spark AR tool? What’s your take on the mobile AR space in general? Do you like Spark AR and TikTok?

Spencer: I have a TikTok account, which I haven’t used or posted to in several years. I like to watch TikTok videos on Reddit, but I don’t really use the app much. As for Spark, actually, the first thing I made that wasn’t just a floating sphere was something complicated I built in Spark AR for an HBO Watchmen game.

It’s been a long time since I’ve done anything in Spark AR. But one of the big differences for me, at least visually, is that the materials, lighting, and textures in Lens Studio are so much better. I don’t know what they are doing, but everything just looks nicer and better lit. Objects look better and adhere better to the head. They have much better logic and processing under the hood, in my opinion.

Next Reality: From a passion standpoint, not a technical one, what is it about AR that fascinates you? And how do you describe the space when you talk to people who are not very familiar with AR?

Spencer: I think AR is the future of computers. There is so much you can do with it, and it can simplify your life. As a designer, it fascinates me: you can communicate with people around the world in AR and just make life smoother. I think that’s the crazy part for me. As a Lens creator, and as a creator in general, I enjoy creating stories that allow people to interact on their own terms. Instead of just looking at my Snapchat, they can interact with a Lens.

Use this Snapcode to follow Audrey Spencer on Snapchat

Next Reality: We spoke before, and I was surprised you didn’t seem that optimistic about AR glasses when talking about the future of AR in terms of everyone walking down the street using them.

Spencer: I want to make it clear: I am optimistic about it, mainly because the price will come down and they will get lighter as the technology gets better. I think they are going to be more like how we think about phones now. We know some of the big problems with AR glasses now facing some of the mainstream companies are (lens) transparency and (image) clarity.

They tint the lenses so you can see the images, which means you no longer see people’s eyes. The displays are not very bright, so you can’t really use them outside. And they are heavy. So right now there are a lot of problems, but people will find ways to fix them, including some of the people I work with. My point is: I’m optimistic, it’s just that the current products need to be improved.
