GENERIC REMOTE: Should Remain Because

I have to say that for mixing, controlling plug-ins and such I also strongly prefer hardware. That’s why there’s no real mixing-board-style stuff on our touchscreen. It’s “just” for firing off commands and macros.

Austrian :smiley:

It’s probably because the entry point is lower: a lot of people know some JS, at least enough to make something of it with the included API.
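
For anyone who hasn’t looked at it, a script is roughly this shape. This is only a sketch based on Steinberg’s published example scripts: the vendor/device names, port names, CC number and surface layout are placeholders, so treat the method names as “approximately right” rather than gospel.

```js
// Minimal MIDI Remote script sketch (ES5). Names and numbers are placeholders.
var midiremote_api = require('midiremote_api_v1')

// Describe the device and its MIDI ports
var deviceDriver = midiremote_api.makeDeviceDriver('MyVendor', 'MyController', 'me')
var midiInput = deviceDriver.mPorts.makeMidiInput()
var midiOutput = deviceDriver.mPorts.makeMidiOutput()

// Auto-detect the hardware by its port names (placeholders here)
deviceDriver.makeDetectionUnit().detectPortPair(midiInput, midiOutput)
    .expectInputNameEquals('MyController MIDI In')
    .expectOutputNameEquals('MyController MIDI Out')

// One knob on the virtual surface, listening to CC 21 on channel 1
var knob = deviceDriver.mSurface.makeKnob(0, 0, 1, 1)
knob.mSurfaceValue.mMidiBinding
    .setInputPort(midiInput)
    .bindToControlChange(0, 21)

// Map the knob to the selected track's volume
var page = deviceDriver.mMapping.makePage('Default')
page.makeValueBinding(knob.mSurfaceValue,
    page.mHostAccess.mTrackSelection.mMixerChannel.mValue.mVolume)
```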

Quite clear! Gotcha… yes, that’s why I continued the journey to the ergo.

Austrian
Understood now!

They all have their strengths and weaknesses… processing commands for control surfaces; I’m not sure that would be too worrisome. Either way, JS or Python would be fine.

I don’t know. JS has made some strides and can be found in some music- and audio-related tools, but ES5 is ancient.

Insert diatribe on how JS is on its way out, it’s hard to maintain, it is a broken language.

Also because there is a large community that detests JS. It has so much negative energy around it that choosing it was bound to create an uproar.

You can get away with JS code running on a web app because you can update it daily. You can’t do that with this API.

It has the worst performance of any of the options.

I literally started writing a book, so… this is the edit.

Can you explain how you set up the tap tempo button and set feature? Are you using MIDI Remote for that or Generic Remote?

@mike_pol What controller are you planning on using? If it’s already a listed controller, then all the better.

The new remote editor is more the tool for this kind of stuff, as it’s pretty easy once you’ve got your head around it. Secondly, do a search in the PDF help file and see if those commands are available yet… the API grows all the time.
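
If you do end up scripting it instead of using the editor, a command mapping looks roughly like this. Again, just a sketch: the names are placeholders, and the category/name strings passed to makeCommandBinding have to match what the Key Commands dialog exposes, so check there (or in the PDF) whether tap tempo is available and what it is called.

```js
// Sketch: binding a hardware button to a key command in a MIDI Remote script.
var midiremote_api = require('midiremote_api_v1')
var deviceDriver = midiremote_api.makeDeviceDriver('MyVendor', 'MyController', 'me')
var midiInput = deviceDriver.mPorts.makeMidiInput()
var midiOutput = deviceDriver.mPorts.makeMidiOutput()

// One button, triggered by note 36 on channel 1 (placeholder values)
var button = deviceDriver.mSurface.makeButton(0, 0, 1, 1)
button.mSurfaceValue.mMidiBinding.setInputPort(midiInput).bindToNote(0, 36)

// The category/name strings must match the Key Commands dialog exactly;
// 'Transport' / 'Start' is a known entry - swap in the tap-tempo command
// if/when it shows up in the list.
var page = deviceDriver.mMapping.makePage('Transport')
page.makeCommandBinding(button.mSurfaceValue, 'Transport', 'Start')
```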

I’m mainly mixing in Cubase, so I don’t use the tap tempo or part play… anyone else?

I don’t see why the new system shouldn’t be able to do everything that can be done with the old Generic Remote. And when it is ready, it should be possible to do an automatic conversion.

It should be able to do more :wink:

But some things don’t need the front end… it’s muscle memory for a lot of functions… sure, faders and knobs, but there is a lot more to lightning-fast workflows than that.

Of course, it can’t be a bidirectional conversion.

And no need either!
But it does need a status-bar type of feedback for the basic functions… like Ableton.

These are most of my common shortcuts.

The rest belong to the hardware knobs/sliders, but these are all on one hand (Razer keypad).

The new system needs to be prepared for MIDI 2.0, which is bidirectional. That is the fundamental change in MIDI 2.0 vs MIDI 1.x. However, it is not mandatory in order to do something useful with it.
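
To be fair, even over MIDI 1.x the current API is already bidirectional in a basic sense: a script can push values back out to the hardware, e.g. to keep an LED ring in sync. A rough sketch (callback and method names quoted from memory; the CC number and scaling are placeholders):

```js
// Sketch: echoing a mapped value back to the controller so its LED ring
// or display stays in sync with Cubase. Placeholder names throughout.
var midiremote_api = require('midiremote_api_v1')
var deviceDriver = midiremote_api.makeDeviceDriver('MyVendor', 'MyController', 'me')
var midiInput = deviceDriver.mPorts.makeMidiInput()
var midiOutput = deviceDriver.mPorts.makeMidiOutput()

var knob = deviceDriver.mSurface.makeKnob(0, 0, 1, 1)
knob.mSurfaceValue.mMidiBinding.setInputPort(midiInput).bindToControlChange(0, 21)

// Whenever the bound host value changes, send it back out as CC 21
knob.mSurfaceValue.mOnProcessValueChange = function (activeDevice, value) {
    midiOutput.sendMidi(activeDevice, [0xB0, 21, Math.round(value * 127)])
}
```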

@cubace … I’m not getting you? Cubase has always been ahead of the game with MIDI since v1.0 on the Atari, so that’s a given… I’m sure they already have that sorted.
I’m talking about UI/UX, which currently doesn’t have a status bar like Ableton’s. Ableton addressed that with Python within its remote framework, and it’s the UI designer’s domain, i.e. unrelated to MIDI (it definitely doesn’t need MIDI 2.0 to achieve that).

Python would have been a much better choice than JS for UI. And their scope seems to be to replace the Generic Remote instead of providing a generic programming interface for Cubase.

JS and Python are both good. Sure, they haven’t adopted the same modularity and framework as Live, but that’s because of Hamburg vs Berlin :wink:

I think both suck. They are not good for real-time tasks, and I think it would be good if the extension language were also capable of doing audio processing efficiently. A programming-language-independent interface, maybe. Rust would have been my pick if there can only be one language.

Sure.
But we need to delineate; this is a user remote context. I do OEM UI/UX for MIDI hardware guys in China… there has been basically nothing that couldn’t be done in Python. Audio is done in JUCE etc. within the VST framework, which is robust and well developed… let’s not blur those lines.
In any case I would use something already existing and well developed, e.g. Max as in Live, but there is just no need.
I want to make music… not program… the move to the new remote was a good one… it just needs some refining, but it is fit for purpose… let’s not lose sight of the music.
It’s not what you’ve got… it’s how you use it :wink:

The VST framework is not that good, at least not in the context of remote controllers.
A VST3 plug-in is supposed to provide functions for QC and the AI knob. This is a GUI method, and many vendors do not respond to it. In Cubase 12 Steinberg added a workaround in the plug-in’s common section. It is very useful, but from a framework-design point of view it is a failure. And I think one reason is JUCE, which is a framework that can be a frontend to VST3 but does not implement all VST3 features.

I think it would be good if you could do functions like “move cursor to the closest audio zero-crossing point”. VSTs are not the right way to move the cursor, and macros can’t analyse audio.
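
Just to illustrate what “analyse audio” means here: finding the nearest zero crossing in a buffer is a trivial loop; the problem is only that neither macros nor the remote API get access to the sample data. Plain JS sketch, assuming the samples were available as an array of floats:

```js
// Return the sample index of the zero crossing nearest to startIndex.
// 'samples' is assumed to be an array of floats in the range -1..1.
function nearestZeroCrossing(samples, startIndex) {
    for (var offset = 0; offset < samples.length; offset++) {
        // Look the same distance to the left and to the right of the cursor
        var candidates = [startIndex - offset, startIndex + offset];
        for (var i = 0; i < candidates.length; i++) {
            var n = candidates[i];
            // A crossing sits between n-1 and n when the sign flips (or hits 0)
            if (n > 0 && n < samples.length && samples[n - 1] * samples[n] <= 0) {
                return n;
            }
        }
    }
    return startIndex; // no crossing found (e.g. DC offset throughout)
}
```

Feed it the cursor’s sample position and you get the snapped position back; the hard part is purely getting at the samples from a remote script.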

I use WaveLab for serious editing, but I’d assumed the Cubase editor handled that type of thing already as a program function… interesting.
I.e. move to the next zero crossing.

In Cubase there is a “Snap to Zero Crossing” option, but I think that is as close as you get.