Some off-the-wall, creative graffiti on humanising interfaces and GA7

First, I just love this masterpiece of software engineering, Groove Agent 5. I wish I had discovered it sooner. I hope it becomes an integral part of Cubase for everyone, fully integrated at every level - I mean living in the Key Editor and in Dorico, engaged with the Tempo track, everywhere you look.

Creativity brainstorm

You might know the word Po? It's De Bono's creative tool: spilling ideas like a deck of cards onto a table. No fear, no censorship.

Here we go…

Preamble:

I always thought there was so little crossover between game technology and our musical stuff - in the field of user interaction. Games use protagonists - they address the whole being.

I think a game programmer would make short work of what follows; it's also interesting from the perspective of its relationship with GPU programming. I want to make it clear that I am not proposing anything that will not serve the purpose of music creation, and I am not suggesting we just swap the wallpapers for fancier ones - just borrowing some structures from games, in minimal ways, to serve the brief of engaging the user more in Cubase, as a creative protagonist: helping them remember how things work, letting them grasp things more intuitively through metaphor.

The Issue:

The problem I want to address is user button fatigue. Button seasickness. The feeling you get when you ask yourself: what did I set for the velocity range of the third crash? Yes, that was 23-100, but what does that mean aurally with this cymbal? Did I apply FX to the group channel in Cubase? Is that it? Why are the highs so high? Now (ten minutes later), where was I on that theme? Oh, I forgot…

Buttons disorientate. The problem is finding the right ones, and only them. Lots of buttons can be a great thing when understood - as here in GA5, where they bring power - but they soon overwhelm, and this is where the limit of a program lies: not with what it can do, but with what the user can comprehend, what the user remembers, what the user can make it do.

What’s Happening?
Users of all applications, but particularly in music production, are faced with multiple seas, forests, dark rooms and deserts. They spend increasing time pottering around in the dark, like a blind man with a clicking stick.

To counteract this, users have become smash-and-grab merchants, with no time to stop. They forget where they have been, because this is one room in a sea of rooms, even within one application. More buttons, more menus. It's another slider, it's another button, it's a drop-down menu, again, again, again. It's as if UI designers are in a cage. The button-slider-menu prison.

Yes, buttons do have their place, and it would be terrible if there were not some consistency, but we can use anything, any metaphor, if we desire. The solution is a mix of both.

We can learn a huge amount from the gaming industry. We can consider our user as a protagonist and represent them on screen.

The key point is that all this is minimal but meaningful, in a Cubase kind of way. It's a more holistic approach in terms of human experience. As a protagonist in a game, one invests more and has a more brain-inclusive experience - we frequently forget to utilise this when we present data using buttons. I am not saying traditional Cubase cannot also remain wholly available. I am saying it can be enhanced by gamification.

PO:

So we build a virtual humanoid, a Cubase protagonist, using very basic game tech. The figure is ghost-like and grey, very simple. We can call him/her "O". O, unloaded, is like a faceless spectre, the ghost of a core, a suggestion of a being.

O enters the simulation chamber. This is like the Star Trek holodeck - Cubasified. O can see the "huge" screen before him (à la Star Trek). It is currently in Project Zone mode. If he chooses, he can dissolve back, at any time, to a traditional view, but this time he prefers to simply scan the Project Zone, to orientate himself, to remember his last edits.

O elects to sit on the drum throne. He could have chosen the cello chair, or say a flamenco guitarist (VST) he owns. This moves the percussion and rhythm views to the centre of the holodeck.

Before him is a grey kit in spectre form. As soon as he sits on the throne, the room changes; other choices of instrument disappear, and we are now in the "drum sphere". On the big screen come The Kits. He can take his wand and pick a whole kit, or just take a snare from kit 4 and a crash from kit 9, and he can meaningfully audition it, either on its own or by using it with the groove (see further). When the kit is selected, the ghost kit gets the image of the kit's items, in colour - the ghost kit becomes "real".
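To make the wand-picking idea concrete, here is a minimal sketch of the idea of a kit assembled piecewise from different source kits. All names here are hypothetical illustrations - nothing in it is a real Groove Agent or Cubase API:

```python
# Hypothetical model of piecewise kit assembly (illustration only, not a real API).
from dataclasses import dataclass, field


@dataclass
class KitPiece:
    slot: str        # e.g. "snare", "crash"
    source_kit: int  # which factory kit the piece was wand-picked from
    sample: str      # sample name, for auditioning


@dataclass
class GhostKit:
    # slot name -> picked piece; unpicked slots remain "ghost" (grey)
    pieces: dict = field(default_factory=dict)

    def pick(self, piece: KitPiece) -> None:
        """Wand-pick one piece; that ghost slot becomes 'real'."""
        self.pieces[piece.slot] = piece

    def is_real(self, slot: str) -> bool:
        return slot in self.pieces


kit = GhostKit()
kit.pick(KitPiece("snare", source_kit=4, sample="TightSnare"))
kit.pick(KitPiece("crash", source_kit=9, sample="DarkCrash"))
print(kit.is_real("snare"), kit.is_real("kick"))  # True False
```

The point of the sketch is just that the "ghost" state and the "real" state are the same object; picking a piece fills a slot rather than replacing the kit.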

Once the kit is built, he switches to another screen. We are now in the Groove Screen. This window has three simultaneous sub-screens.
On top, we see and hear a pulse quietly throb by. It's a pulsating "thread" of holographic hues - a kind of multicoloured embroidered twine that passes before our eyes like threads being spun on a spinning wheel. It reflects aurally the project tempo and the main oscilloscopic view (main out), or whatever is selected instead (think of a screen like the Visibility agent). We see a representation of the metronomic beat, perhaps as markers (like hitpoints) promenading past, integrated into the image of the thread. It has a special Listen mode, where the tracks are adulterated to give a drummer what they need aurally. The groove defaults to a skeletal groove, just enough to give the project tempo settings some life.
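The metronomic markers on the thread are simply the project tempo made visible, so the underlying timing maths is tiny. A sketch, with assumed example values (nothing Cubase-specific):

```python
# Onset times of the thread's metronomic beat markers, derived from tempo.
def beat_times(bpm: float, beats: int) -> list:
    """Return the onset time in seconds of each metronomic beat."""
    interval = 60.0 / bpm                    # seconds per beat
    return [round(i * interval, 3) for i in range(beats)]


print(beat_times(120, 4))  # [0.0, 0.5, 1.0, 1.5]
```

Each marker "promenading past" would be drawn at one of these onsets, scrolled at whatever speed the thread moves.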

Below, in the middle view, we see a drum/key editor showing one, two, four or eight bars; we can see the notes piano-rolling by. We can either hear our other tracks or mute them.

Below this, on the same simulated holodeck screen, is a notation window with the same defined bars, grooving.

The invitation here is clearly to hit some beats - yes, this is YOU sitting behind that drum kit. YOU are there.

O puts together a couple of grooves and saves them with the project. For the next vibe he fancies a tam-tam, and switches in a conga for the toms. He can search for styles. If he chooses Latin, the visual groove (which is pulsing by at low volume) will take on a sunny hue - it responds "emotionally" to him. Maybe for another groove, with lots of hard, boppy polyrhythms, a New York street can be selected behind the hues.

So, O decides one of these groove loops is "golden". He presses the golden button and it arrives in his project as the first (default) version. He can mark his other grooves as other versions, and they will be available in the track as groove versions, in the same way as track versions, but in the Project Zone.
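The "golden" idea maps naturally onto a small versioning structure, mirroring how track versions behave. A hypothetical sketch - this is not Cubase's actual track-version model, just an illustration of the concept:

```python
# Hypothetical groove-version store: many saved grooves, one "golden" default.
class GrooveTrack:
    def __init__(self):
        self.versions = []   # saved groove loops, in the order O created them
        self.golden = None   # index of the default ("golden") version, if any

    def save(self, groove: str) -> int:
        """Save a groove with the project; return its version index."""
        self.versions.append(groove)
        return len(self.versions) - 1

    def press_golden(self, index: int) -> None:
        """Mark one saved groove as the project's default version."""
        self.golden = index


track = GrooveTrack()
track.save("latin-sunny")
golden_idx = track.save("ny-street-polyrhythm")
track.press_golden(golden_idx)
print(track.versions[track.golden])  # ny-street-polyrhythm
```

Pressing the golden button changes only which version is the default; the other grooves stay available in the track, just as track versions do.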

Seeing all this visually brings vivacity to the experience of the screen. It helps memorisation and engagement. It makes things meaningful in human terms. Placing the user in the screen, as gamesters (but apparently not music-tech creators) know, brings more personal, emotional involvement. We all know how important emotions are in music. We all know how hard it is to stay musically creative in a sea of buttons.

Cubase could be an adventure, with different narratives, without losing functionality or becoming gimmicky. I know that when I see a groove of colours pulsating visually across a screen, I can engage with it emotionally.

This is the Groove Environment on the holodeck.

Help is also at hand, but now in a new way. If O wants, he can communicate with his AI colleague, H. This can be in the form of a text query, or more. AI text queries about Cubase already work really well in Google. You can ask detailed questions like "What's the difference between a pattern and a groove in Groove Agent 5?" and you get a coherent reply - often with references to a particular YouTube video, at a certain time point. These replies are often better than the manual, because you can refine your query.

H could be present to interact with our protagonist. He can be a giver of technical advice, a friend of the protagonist with tech knowledge - a Scotty from Star Trek. Someone who can perform quick tasks too. I am not talking about a silly Mr Clippit type thing; AI can now make this entity genuinely intelligent.


I am not trying to lose Cubase here. Instead I am creating a 3D shell, invoking more 3D approaches to engaging the user.

Obviously, it is often best, and necessary, to view a mixer in the established way, but you can do this right there on the holodeck, as if in the comfort of your living room. It can be a more intimate, more humanly relatable experience. If a user is sitting in front of a mixer and panicking, there should be help right there in the mixer. One should be able to ask questions like "Why mix?" and get a coherent AI answer. One should be able to see visual explanations of how tracks get modified - the data flows. There can be AI guides that can be employed to show how tracks can be developed, explain workflows, and point out features.

By using gameplay in a highly minimalistic way, only with the intention of making navigation understandable in realms beyond "click and don't click", I think things can stick more easily in human understanding. It's more of a 3D experience.

We need to break moulds. Otherwise, with detail come unmemorisable button forests, and with them, casual, non-deep usage.

This product deserves more.

Z

Instead of calling “him” Po, just call “him” HAL.