New to Jam Origin - Technology Clarification Questions

Hi Jam Origin,

I am evaluating your products and have a few questions.

VST3 introduced Audio Input for VST Instruments in 2008 - the same year Jam Origin was started, which I am sure is not a coincidence.
Hence, VST3 Instruments no longer have to be driven by MIDI, with all its limitations.
At this point the role of MIDI Guitar becomes pretty blurry to me.

Into which format does raw Audio Input get converted so that VST3 Instruments can function at full capacity?? I am guessing it wouldn't make sense for it to be converted into MIDI messages, or would it?
Is it up to the DAW or up to the VST3 Instrument plugins to facilitate this conversion??
DAWs do have to facilitate sidechaining, though I am not sure whether they are also responsible for converting audio input for VST3 Instruments.

This brings me to:
What role does the MIDI Guitar software play in this equation??
Can MIDI Guitar be considered a DAW that facilitates sidechaining, just like any other DAW that supports sidechaining??
Is it then a question of sidechaining latency optimisation??
Is it a Pitch-to-MIDI technology, or a raw-Audio-Input-to-VST3-Instrument-format technology??

What happens if I sidechain a live audio channel through a VST3 Instrument in a DAW that supports sidechaining (e.g. Cubase 9 onwards etc)?
Will latency be too high or will it be reasonable?
Or is something else different with MIDI Guitar??

Is there a term that distinguishes MIDI-driven VST3 Instruments from Audio-Input-driven VST3 Instruments?
I see people mixing the two up all over the place. There has to be a separate name for each.

Can MIDI Guitar operate from 2 physical audio inputs simultaneously, i.e. for 2 guitars processed on one PC/Mac at once?
What about 2 guitars and one input for voice?
How many CPU cores are necessary for this?
Does MIDI Guitar support VSTi drums?

There was a comment from 2020 saying that GPU processing was not on the cards. Is that still the case in 2022?

Many thanks

As I understand it, MIDI Guitar 2 converts raw audio into MIDI CC messages.

I am not sure that the VST3 standard removes any MIDI limitations; it adds MIDI functions that were not previously available under VST2.

How would you trigger VST3 instruments if you were not using MIDI or a MIDI controller? Even MPE controllers convert their information into MIDI CC messages.

Unless I have missed something?

To answer some of my own questions: MG is a plugin host that maps audio input into either VST 3.5 Note Expression or MIDI MPE, depending on which of the two is supported by a particular plugin. Or does MG only support MPE??
MIDI limitations, in terms of audio-signal sidechaining, were removed with the Note Expression API that became available in 2011 with VST 3.5.
MIDI MPE tried to catch up in 2018, though Note Expression remains a broader solution with higher resolution.
MIDI 2.0 may finally be able to come significantly closer to Note Expression and spread into the AU/AAX/RE formats.
VST 3.7 introduced support for MIDI 2.0 in 2020.
Note Expression is a software-only solution for VST Instruments and can be expanded well beyond MIDI 2.0 as necessary.

As I understand it, MIDI CC, or MIDI in general, is not necessary for plugins that support Note Expression when the VST Instrument is driven by a raw audio signal.
“Note Expression” does not come up when searching this forum; hence I do not think this forum is a good source of information. It probably plays to Jam Origin's interests.

“Pitch to MIDI” represents a subset of MPE and does not even consider the Note Expression API, as that is not MIDI.
“Pitch”, i.e. the fundamental frequency, represents only a small aspect of the entire audio signal spectrum.
One of the main proponents of MG here keeps pushing “Pitch to MIDI” onto people, which is a big red flag.
Once things settle down a bit in the next couple of years, it may make some sense to say “Signal to MIDI 2.0”.

Anyhow, my questions still stand:
Is sidechaining into VST Instruments by most DAWs today done via conversion into MPE and Note Expression?
What latencies do other DAWs produce when sidechaining live audio into VST Instruments, compared to MG?
And a historical question: how were VST3 Instruments driven with an audio signal between 2008 and 2011, prior to Note Expression in VST 3.5??

Does MG only support MPE or does it also do Note Expression??
Is MIDI 2.0 support coming with MG3??

Based on the MG3 Roadmap thread, it appears MG2 does not support MPE.

VST Note Expression and Audio Input into VST Instruments were introduced in 2011.
The earlier “VST Expression” was introduced in 2008, and I am guessing it was possible to organise one's own solution for converting an audio signal into “VST Expression” between 2008 and 2011.

Does Jam Origin even support Note Expression??
If Jam Origin supported Note Expression, then adding MPE support would be trivial.
Lucky I am only coming onto the scene now, in 2022.

What in the flying world was Jam Origin doing all these years since 2008 if Note Expression and MPE are not supported??
Any response from Jam Origin?

“Note expression” is a vague term… and not really a term in any MIDI message in MIDI 1/2/MPE. But I understand what you mean - those extra dimensions are translated into MIDI (note pressures and channel pressure, bends, and CC expressions, especially CC74, which is generally used as the “third dimension” in synths).

In MG2 you can send CC and expression messages to the synth (within a DAW). It will also extract “Loudness” and “Brightness” signals from your guitar playing, and you can map those to any MIDI message as well. If you really want to decouple the guitar pitch from the MIDI pitch, you can even do that with a MIDI Machine (and some coding).
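Roughly, mapping those extracted signals onto CC messages looks like this (a minimal sketch for illustration, not MG's actual code; the CC numbers are just common conventions):

```cpp
// Sketch only: turn normalized "Loudness"/"Brightness" values (0..1) into
// 3-byte MIDI Control Change messages. CC11 (expression) and CC74 are common
// targets, but in MG2 you can map the signals to any CC you like.
#include <algorithm>
#include <array>
#include <cstdint>

std::array<std::uint8_t, 3> makeControlChange(std::uint8_t channel, std::uint8_t cc, float normalized)
{
    const float clamped = std::clamp(normalized, 0.0f, 1.0f);
    return { static_cast<std::uint8_t>(0xB0 | (channel & 0x0F)),   // status byte: CC on given channel
             cc,                                                    // controller number
             static_cast<std::uint8_t>(clamped * 127.0f + 0.5f) };  // 7-bit value
}

// e.g. once per analysis block:
//   auto expr   = makeControlChange(0, 11, loudness);    // channel 1, CC11
//   auto bright = makeControlChange(0, 74, brightness);  // channel 1, CC74
```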

Please tell me if there is anything else you want extracted from the guitar signal.

The limitation in MG2 is that it uses MIDI channel 1 for everything, so if you play a chord, all notes get the same “expression”. This was indeed because of how most synths worked before MPE was introduced.
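For contrast, an MPE sender rotates each new note onto its own member channel (channels 2-16 in the default lower-zone layout, with channel 1 as the master channel), so per-note bends, pressure and CC74 only affect that one note. Roughly like this simple allocator (again just a sketch, not MG code):

```cpp
// Sketch of the core MPE idea: give every sounding note its own MIDI channel
// so that pitch bend, channel pressure and CC74 can be sent per note instead
// of affecting the whole chord (which is the MG2 / channel-1 limitation).
#include <cstdint>

class MpeChannelAllocator
{
public:
    // Returns a channel index 1..15 (MIDI channels 2..16), assuming the
    // default lower zone with channel index 0 (MIDI channel 1) as master.
    std::uint8_t allocate()
    {
        const std::uint8_t channel = 1 + next_;        // skip the master channel
        next_ = static_cast<std::uint8_t>((next_ + 1) % memberChannels_);
        return channel;
    }

private:
    std::uint8_t next_ = 0;
    std::uint8_t memberChannels_ = 15;   // a real sender also tracks note-offs and reuses channels
};
```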

MIDI 2.0 and MPE are coming with MG v3 (a free update), which is still in development, and it will make all sorts of “note expression” easier to cable up / program.

Obviously, any latency on the DAW side depends on the DAW and how it is processing its plugins, but that latency is not notable for most setups, and I suspect it is at most one buffer size.
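For a sense of scale (plain arithmetic, not a measurement of any particular DAW), one buffer of added latency works out to:

```cpp
// One audio buffer expressed in milliseconds:
//   128 samples @ 48 kHz   ≈ 2.7 ms
//   256 samples @ 44.1 kHz ≈ 5.8 ms
double bufferLatencyMs(int bufferSamples, double sampleRate)
{
    return 1000.0 * bufferSamples / sampleRate;
}
```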

Thanks for clarifying that MIDI 2.0 and MPE are coming in MG3.

Note Expression is anything but vague. It has been part of the VST SDK since 2011 - VST 3 SDK: RangeNoteExpressionType Class Reference
Value ranges are 32-bit, as with MIDI 2.0, though the number of types is unconstrained, i.e. more freedom than with MIDI 2.0.
Note Expression was explicitly designed to be independent of MIDI, so MIDI will have no reference to it whatsoever. Steinberg has a special place for Note Expression on their “technologies we are proud of” page - Our Technologies | Steinberg
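For reference, this is roughly how a host hands per-note expression to a VST3 instrument, going by the public SDK headers (a sketch only, assuming the VST3 SDK is on the include path; not taken from any product):

```cpp
// Sketch against the public VST3 SDK headers: a host queueing a per-note
// "Brightness" value for one specific note via its noteId - no MIDI involved.
#include "pluginterfaces/vst/ivstevents.h"
#include "pluginterfaces/vst/ivstnoteexpression.h"

using namespace Steinberg::Vst;

Event makeBrightnessEvent(Steinberg::int32 noteId, double normalizedValue)
{
    Event e = {};
    e.busIndex = 0;
    e.sampleOffset = 0;
    e.type = Event::kNoteExpressionValueEvent;
    e.noteExpressionValue.typeId = kBrightnessTypeID;  // one of the predefined NoteExpressionTypeIDs
    e.noteExpressionValue.noteId = noteId;             // ties the value to a single sounding note
    e.noteExpressionValue.value  = normalizedValue;    // normalized [0, 1]
    return e;
}
// The host then adds this event to the IEventList it passes to the plugin in process().
```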

Guitar MODS turns out to be the perfect reference point for my questions below:

  • What can the “proprietary synth engine” of Guitar MODS do that VST Note Expression cannot??
  • Why not just use Note Expression (which is free of MIDI) for Guitar MODS??
  • Why did Jam Origin start in 2008 if it never used VST Note Expression? A puzzling coincidence??
  • Why convert audio to MIDI and pump that MIDI into a DAW if instruments already support Note Expression, which is MIDI-free, better than MPE and even better than MIDI 2.0??

Looking forward to MG3. Appreciate your work.

Ah, ok. We are coming from polar opposites here, but that’s interesting!

From a technical point of view, Steinberg's “Note Expression” and MIDI 2.0 are very similar. I don't see how “Note Expression” is of any technical advantage over MIDI (please educate me if you know). I think you are reading too much into Note Expression. It's just Steinberg's proprietary MIDI 2.0, and giving it another name is just marketing.

As for choosing MIDI vs Steinberg:

  • MIDI is non-profit, liberally licensed, very mature and ubiquitous.
  • With all respect for Steinberg, they are a single company, with a proprietarily licensed VST SDK, and they recently set fire to the entire industry by discontinuing VST2 and introducing a new proprietary VST3 license (which was made more approachable after lots of pressure from developers, if I recall correctly). This actually triggered the new CLAP format, an alternative to VST3, by u-he, Bitwig and smaller developers.
  • The elephant in the room: Apple (and hence the majority of MG users) uses DAWs that don't even support VSTs.

All that said, we have a good dialogue with Steinberg and certainly want to continue supporting their technologies.

MIDI Guitar of course has something like “Note Expression” internally; the actual representation is really not very important to MG or Guitar MODS (the tracking, and extracting all sorts of information from the guitar playing, is what is important). But it is easy enough to expose/reduce this internal representation into “Note Expression” or MIDI 2.0.

At last we are on the same page about Note Expression.
Apple does present a problem; however, most DAWs have macOS versions with VST support, and there are AU/VST wrappers for Logic Pro. In other words, I still do not see any reason to avoid Note Expression.

I am certainly not a Steinberg evangelist, nor am I their proponent. I am raising these questions to understand Jam Origin's historical motivations.
Steinberg had their version of MIDI 2.0 over 10 years prior. Why would they have called it MIDI 2.0 if, back in 2008, they were specifically aiming to circumvent MIDI 1.0's limitations? I wouldn't call this “marketing”.
I am still a little stunned to discover that Note Expression did not come into your experience until now.
How is that even possible?

As for whether Note Expression is better than MIDI 2.0, I can see with the naked eye that Note Expression is more extensible, whilst MIDI 2.0 is forever stuck at the spec level until MIDI 3.0 comes along.
Steinberg will be able to give a much better answer and I am 100% certain they have plenty to say.
They should also weigh in on how it translates into practical benefits of driving virtual instruments.
I’d be interested to know as well.
Perhaps the answer will come from the following question:
Why is Guitar MODS necessary if MIDI 2.0 satisfies all your requirements?
Is it about latency or resolution??
If yes, it brings me back to Note Expression.

It's been a decade since I looked into the VST3 spec. MG2 is actually a VST2, because when we built a VST3 there were so many inconsistencies and compatibility issues with various DAWs - depending on the settings, some wouldn't output MIDI and others caused additional latency. The documentation was terrible, and with the additional complexity over VST2, DAW developers just had to find their own way, so in the end every DAW would do things differently. It's probably better now, but the JUCE framework (which is the most popular framework for plugin developers and MPE stuff) is still often pushing VST3-specific fixes in 2022.

But in any case, please let me know if there actually exists a popular synth that makes use of Note Expression. It's no big deal for us to add support for it, but only if it's actually used by anybody.

Guitar MODS can indeed be a MIDI 2.0 synth, if it's driven by MG.

Is sidechaining into VST Instruments (i.e. using an audio signal instead of MIDI) a niche use case?
Could this be why Note Expression is generally sidestepped by plugin developers??
Note Expression was designed specifically for this use case.
What I am saying is that the reasons do not appear to come from compatibility issues (Mac is fine) or SDK licensing issues (I do not see any infringements on developers there).

In the end there may be a favourable resolution on Note Expression support.

PaulWalmsley of Steinberg references a video that has plenty of info on MIDI 2.0 and Note Expression spec differences: Arne Scheffler and Janne Roeper - Support of MIDI2 and MIDI-CI in VST3 instruments - YouTube
A comparison of the Note Expression CC spec vs the MIDI 2.0 CC spec: Arne Scheffler and Janne Roeper - Support of MIDI2 and MIDI-CI in VST3 instruments - YouTube
Based on the entire video, Note Expression is exponentially broader than MIDI 2.0, i.e. it is more like a MIDI 3 or 4.
Paul also says:
“if there any plugins that want to support MIDI 2.0 then they will need to implement VST Note Expression in a VST3 plugin (because the tuning support is based on Note Expression)”

Would you be able to confirm through your Steinberg and JUCE channels that Paul’s statement is true and to what extent?

Looks like Steinberg put their foot down, and I am in full support of this move.
To answer your question: if the above statement is true, then any VST3 with MIDI 2.0 support will be using Note Expression.

Very interesting to read.

As a Cubase user I’d say Steinberg could have done way more to get guitarists and bass players to become Cubase users if they really wanted to.

But there’s sadly no sign of this ever happening.

  • Since they developed “Note Expression”, they could have, if they really wanted to, made something similar to Jam Origin's MIDI Guitar themselves. Right?
  • Since Steinberg/Yamaha/Line6 are really one company, they could have made 1:1 VST versions of maybe the Yamaha Magicstomp and other Yamaha-related guitar gear.
  • They could have also incorporated the modeling that Line6 uses in the Variax, but in VST format.

And so on and so on…

Strange business strategy if you ask me… :wink:

No. “Note expression” has nothing to do with audio recognition or making guitar effects.

In general there are no bottlenecks or issues in terms of encoding and transporting musical information among DAWs, plugins and hardware. “Note Expression” is just an API and an encoding of musical data, similar to MIDI, but proprietary to Steinberg. MIDI Guitar also has its own “Note Expression” API, but, just like with Steinberg's, it doesn't really add anything for users (apart from the possibility of Lua scripting).

What you really want is a guitar tracking system with some kind of multi-dimensional feature extraction that maps over to all the great new MIDI MPE synths and DAWs with first-class MPE support. MGv3 is exactly this.
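To make “multi-dimensional feature extraction” concrete, think of the internal representation as something like the struct below (purely illustrative, not MG's actual data model); rendering it as MPE, MIDI 2.0 or “Note Expression” is then just an encoding step:

```cpp
// Illustrative only: a per-note bundle of continuously tracked features.
// A tracker fills one of these per sounding note, many times per second;
// encoding the result as MPE, MIDI 2.0 or Note Expression is a separate step.
#include <cstdint>
#include <vector>

struct TrackedNote
{
    std::int32_t noteId;    // stable id while the note sounds
    float pitchSemitones;   // continuous pitch, e.g. 64.37 (bends included)
    float loudness;         // 0..1
    float brightness;       // 0..1
};

struct TrackerFrame
{
    double timeSeconds;
    std::vector<TrackedNote> notes;   // polyphonic: one entry per sounding string/note
};
```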

Steinberg did add sidechaining into VST Instruments in Cubase 9 in 2016, which evolved further in version 11 and again in the current Cubase 12.
Other DAWs also support sidechaining.

This raises some key questions:
Is Cubase's sidechaining (a.k.a. multi-dimensional feature extraction or “tracking”) monophonic or polyphonic??
Can a VST Instrument be effectively driven by a guitar input with Cubase's internal “tracking”/sidechaining??
What are the limitations of this today?
If Cubase's sidechaining is monophonic only, does it work well with hex pickups and interfaces like the Boss GP-10 (i.e. with 6 separate monophonic channels)??
What are other DAWs' limitations in this respect?

What we really want is a choice of multiple DAWs capable of “tracking” independently.
It would be wonderful if Jam Origin happens to be the top polyphonic “tracker”, if it is capable of doing so (how does it compare to Steinberg's sidechaining today??).

What we also want are synths based on MIDI 2.0 (which presumably includes Note Expression support) as opposed to MPE, although MPE is a start.
Is there a regularly updated page with plugins sorted by MPE and MIDI 2.0 support?

No. Sidechaining also has nothing to do with tracking. It might be used for triggers, but that's just about useless in the context of polyphonic audio.

It turns out most DAWs will convert audio to MIDI with varying degrees of success. Mostly monophonic. Melodyne seems like a good option for polyphonic material.
The trouble is that none are capable of real-time conversion, i.e. everything is done with pre-recorded tracks.
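To be clear about what the monophonic case boils down to, the core step is estimating one fundamental per buffer and rounding it to a note number - a toy sketch (nothing like what a real-time polyphonic tracker has to do):

```cpp
// Toy monophonic pitch-to-MIDI sketch: estimate one fundamental per buffer via
// autocorrelation, then round it to the nearest MIDI note number.
// Real-time polyphonic trackers (MG included) do far more than this.
#include <cmath>
#include <cstdio>
#include <vector>

double estimatePitchHz(const std::vector<float>& buf, double sampleRate)
{
    const int minLag = static_cast<int>(sampleRate / 1000.0);  // ignore anything above ~1 kHz
    const int maxLag = static_cast<int>(sampleRate / 60.0);    // ignore anything below ~60 Hz
    double bestCorr = 0.0;
    int bestLag = 0;
    for (int lag = minLag; lag <= maxLag && lag < static_cast<int>(buf.size()); ++lag) {
        double corr = 0.0;
        for (std::size_t i = 0; i + lag < buf.size(); ++i)
            corr += buf[i] * buf[i + lag];
        if (corr > bestCorr) { bestCorr = corr; bestLag = lag; }
    }
    return bestLag > 0 ? sampleRate / bestLag : 0.0;
}

int frequencyToMidiNote(double hz)   // A4 = 440 Hz = MIDI note 69
{
    return static_cast<int>(std::lround(69.0 + 12.0 * std::log2(hz / 440.0)));
}

int main()
{
    const double sampleRate = 48000.0, pi = 3.14159265358979323846;
    std::vector<float> buf(2048);
    for (std::size_t i = 0; i < buf.size(); ++i)                 // synthetic test tone: 196 Hz (G3)
        buf[i] = static_cast<float>(std::sin(2.0 * pi * 196.0 * i / sampleRate));
    const double hz = estimatePitchHz(buf, sampleRate);
    std::printf("estimated %.1f Hz -> MIDI note %d\n", hz, frequencyToMidiNote(hz));
}
```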

Jam Origin does stand out from the crowd in this area.
Did nobody really come out with a competing real-time polyphonic solution in the past 10 years?
Are there compact DSP boards that will do flawless polyphonic conversion?

The CLAP format did officially get a release in June. With a bit of luck its audio conversion spec will expand to the level of Note Expression. Hopefully CLAP's MIT license will be sufficient to displace VST.
I suppose it is safe to move on from VST now that parts of Note Expression are integrated into MIDI 2.0.
I will ask Steinberg why they created Note Expression without providing a rich polyphonic (they do mono) conversion engine. It seems like an architectural flaw.

Thanks for being patient, JamO. I wish the answers I was looking for were closer to the surface.
Looking forward to MG3.
Will it come out in 2022?