Dev Log #3 – Playtesting for feedback and usability in a Mixed Reality space

Hey you! So in the last post, I said that we reconsidered a lot of our original design choices and walked through the process that led us to where we currently are in the project. The project now has a much more defined concept, which is:

A sandbox mixed reality pet sim where the player can interact with floating blobs by poking, petting, squishing and merging them.

Our game is built around the idea of open exploration: we don’t want to explicitly impose strict goals on the player. Instead, we want to present rules and interactions that the player can use to create their own goals. We currently have a pretty full list of possible interactions, but for the alpha, we only really want to test out three in particular: picking up, pushing/poking, and merging/splitting (I mean, when I put it like that, it’s more like five). The game is all about player experience, so we have to consider not only how these interactions will work but also what kind of feedback they will produce when the player performs them. That’s where testing comes into play!
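To make the three alpha interactions concrete, here’s a minimal sketch of how they might be modeled, with each interaction emitting a feedback cue. This is purely illustrative; every name and value here is made up, not our actual engine code.

```python
# Toy model of the three alpha interactions: pick up, push/poke,
# and merge/split. Blobs are plain dicts; feedback cues are strings.

def pick_up(blob):
    """Attach the blob to the player's hand and emit a wiggle cue."""
    blob["held"] = True
    return "wiggle"

def poke(blob):
    """Push the blob away slightly and emit a squish cue."""
    blob["velocity"] += 1.0
    return "squish"

def merge(a, b):
    """Combine two blobs into one larger blob."""
    return {"held": False, "velocity": 0.0, "size": a["size"] + b["size"]}

def split(blob):
    """Split a blob into two half-sized blobs."""
    half = {"held": False, "velocity": 0.0, "size": blob["size"] / 2}
    return half, dict(half)
```

The point of sketching it this way is that every interaction returns some feedback, which is exactly what our playtests are meant to tune.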

For the last few weeks, our narrative designer and production manager, Jen, and I have been performing focus group tests with people around the school. Our goal has been to help the other departments on our team get an idea of what direction we should go based on feedback from potential users. Essentially, we have filled a bit of a QA role these last few weeks instead of design, but we both decided that it should be fine considering that the bulk of the design work for the alpha has already been done and getting early user data would be beneficial.

Preliminary Designs by Yani Wang
Silhouette Table by Keana Almario and Yani Wang

So what did we subject the poor souls of Sheridan College to in the name of SCIENCE!?! Well, first we did preliminary art tests for our art department. We want the creatures the players are interacting with to facilitate what you can do to them. Balancing that with our goal to always present these creatures as being alive as opposed to objects ended up being tricky. Like, really. How do we get people to squish, push, and smoosh creatures if we tell them they are alive? Our art team, consisting of Keana and Yani, drew up these mock-ups to help us test designs:

Using the mock-ups, we took my now-deceased laptop (R.I.P Lappy) to the halls of Sheridan and asked passing students what they thought. During our first tests, we found that most people liked the blobbier, less animalistic designs from the first page, and that people really liked N, C, and F from the second page. We took these results back to our art team so that they could use the feedback in further designs.

After that set of testing, Jen and I decided to do some “material testing”. I don’t think this kind of test could exist for anything other than a mixed reality game, but we felt we needed good feedback on how our blobs should respond to player interaction. In order to simulate these interactions, we bought objects that resembled the consistency and material of the blobs we wanted to make, took them to school, and asked what people thought. These toys included some slime, silly putty, play-dough, and a stress ball.

And no, we didn’t add googly eyes while we were testing.

We found that the two most popular materials were the slime and silly putty as they were squishier and more fun to play with. Conversely, we found that the stress ball was the least favourite as it was harder to squish. Also, people mentioned they couldn’t really see a creature with the same type of consistency as the stress ball, as they were tempted to throw it instead of pet it. We recorded the data and gave it to our art team for review.

Art board created by Keana Almario

Lastly, we took another art mock-up sheet and went out to ask students what they thought about the designs.

The mockup was split into rows and columns, and we asked students to pick which blob they liked and why. The designs with the white boxes around them were the most popular choices, as they were the ones that gave the most character. Oddly enough, we found that the arms on the blob were very popular, which seemed to contradict the first test where people favoured the more amorphous design. In the end, we came up with a final-ish design for the creatures, which looks like this:

Visual Mockup by Keana Almario

So this is all well and good, but you’re here for design, right? I like to think that testing is part of the design process. We want to make an engaging experience for players, and because of that, we have to be in tune with player reactions as often as we can throughout development. Also, as I said before, we wanted the blob’s design to facilitate how the player can interact with it, making it just as much of a design challenge as an art challenge. Now, where do we go from here? With the semester wrapping up, we don’t have much to do before the alpha comes out, but Jen and I would like to hit the ground running in the new year by prepping for a couple of challenges we know we’re going to face.

To start, we currently have an issue where if the blobs get too big, they can clip through the player’s head. This causes the blob to disappear when it gets too big and the player picks it up. We’ve already discussed some possible solutions, such as having the blob burst if you feed it too many other blobs, but that’s just one of the challenges we are foreseeing. Next post, I’ll be talking about how we approached solving these challenges as well as some other cool systems I helped design.
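The burst idea above is essentially a cap on blob growth. Here’s a rough sketch of what that rule could look like; the threshold, the number of child blobs, and the function name are all hypothetical placeholders, not a decision we’ve actually made yet.

```python
# Illustrative sketch of one proposed fix for head-clipping:
# cap blob size, and burst an oversized blob into smaller ones.

BURST_SIZE = 3.0  # hypothetical max size before a blob pops

def feed(blob_size, food_size):
    """Grow a blob; if it crosses the cap, burst it into small blobs."""
    new_size = blob_size + food_size
    if new_size >= BURST_SIZE:
        # Burst: return several small blobs instead of one huge one.
        n_children = 3
        return [new_size / n_children] * n_children
    return [new_size]
```

The nice side effect of a rule like this is that the player discovers it themselves, which fits our no-explicit-goals philosophy.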

Devlog #2 – Designing virtual interactions in a physical space

Hey you! Last week, I talked about our design challenge and phrased it a bit like this:

How do you design an engaging and immersive game world using the real world around you?

This challenge guided us throughout our early production stages and gave us something to look back at if we found ourselves stuck on a particular problem.

One of these problems was arguably one of our more interesting features.

One concept that went pretty far into production was a monster hunting game similar to Keep Talking and Nobody Explodes. One player would be the hunter, who could see the monster and attack it, while the other was the veteran, who knew how to defeat the monster based on a manual we’d give them. We tested the concept and found success in making it fun and engaging, but we hit a practical problem: what’s stopping people from playing this game on their own? MR lets users see the game on top of the real world, so one issue we found was that the player fighting the monster could also just hold onto the manual and fight the monster by themselves. Another issue we ran into was that, for the non-MR player, fighting a monster is more fun than reading a book, so is it actually fun to be the non-MR player? The team knew we had to rethink things, and during one of our meetings, I proposed something to the team:

Instead of fighting the monster, what if you were caring for it?

Prior to this meeting, we had talked to our contact at Shadow Factory, Keiran Lovett, and he suggested we should focus on UX. We brought that advice to this meeting and tried to step away from mechanics and goals. The intention was to make the game feel more like a sandbox experience where the focus is more on emergent gameplay as opposed to set rules and win conditions, which is when we landed on the monster pet care game idea. We eventually iterated on it further so that the monster turned into multiple spirits, or will-o’-wisps, that the player can interact and play with.

Despite sounding pretty simple, this decision was actually incredibly difficult and came with its own risks. At this point, we were far into the semester with a playable alpha looming over us 6-7 weeks away (not including a week where the majority of the team would be in Montreal), and changing our game could prove incredibly risky. We had a couple of arguments, long awkward silences, and debates over Naruto characters, but in the end, we made the call and went forward with our plan to change our concept.

So was it a good call?

It’s certainly too early to say, but the project’s been making considerable progress from that meeting to the time I’m writing this dev log. From that point, we managed to do a bit more testing with the new concept similar to our early testing, including some digital prototypes like the one seen here.


One early parameter we set for ourselves when designing for this game was:

The spirits need to feel like creatures and not objects

What this means is that the spirits need reactions to player interaction. We want to sell the player the idea that when they put on the headset, they are viewing an unseen world that exists on top of our own. In order for this narrative to work, the creatures that reside in this world need to feel alive and responsive. This parameter has helped guide our design process and shape what interactions the player can take with the creatures.
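One way to think about the "creatures, not objects" rule is that every interaction maps to an immediate reaction, with a default so the spirits are never inert. The sketch below is a toy illustration of that principle; all the interaction and reaction names are placeholders I made up.

```python
# Toy sketch: every player interaction gets a visible response,
# and anything unrecognized still gets an idle acknowledgement.

REACTIONS = {
    "poke": "wobble away and blink",
    "pet": "lean into the hand and hum",
    "squish": "squash, then spring back",
    "grab": "wriggle gently",
}

def react(interaction):
    """Return the spirit's response, falling back to an idle glance."""
    return REACTIONS.get(interaction, "turn toward the player")
```

The fallback is the important part: an object ignores you, but a creature always notices you.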

As I mentioned, our playtesting style is very similar to what we have been doing before. Testing this way is effective because our interface is the player. We need to design a game that makes interactions feel as natural as possible. By doing physical prototyping, we are bound to real-world concepts such as physical space and gravity, grounding our design ideas to them. With that being said,

How do you prototype ghosts?

As far as I know, ghosts aren’t real, and if they are, I’m not sure how to acquire them for playtesting. However, balloons are real, and ribbons are real, so combine the two and you get:

Okay, so it’s not perfect, but it did get us the kind of movement we wanted for the wisps, which looks like this:

Cool! Now that we had an idea of what kind of game the player would be taking part in, the important question that followed was: what interactions are fun to do with them?

Three other team members and I sat down and brainstormed a list of ideas we would like to explore, which I later transposed into a chart in our GDD that looks like this:

Where we are now

Officially we are out of pre-production and production is in full force. We currently have some digital prototypes and we received equipment that allows us to have the freedom of movement we want, so we should be testing that very soon.

As for our next steps, we are looking at exploring more interactions with the wisps and producing digital prototypes we can test externally; we plan on going to malls and testing there. Meanwhile, our artists are working on first passes of the creatures we tested externally with students at the school.