Help! Optimizing my Tracking for real time playing on Mac

Hello, I purchased Jam Origin recently based on some highly regarded reviews of its tracking. However, it isn't meeting the expectations I had for how fast the program can track notes. I would like to see if there is anything I am doing wrong. Here is my setup:

2016 MacBook Pro
3.3 GHz Intel Core i7 processor
16 GB RAM
500 GB SSD
OS: High Sierra

Interface: Apollo Duo
Guitar: G&L S500 on bridge pickup

Logic Pro X: I've used all the buffer sizes, including the ones recommended by Jam Origin, and they don't really do the job well. I also run it in low latency mode. I've also adjusted the noise gate constantly, which helped, but didn't do the trick.

Synths: I've used Jam Origin's synths, which didn't track super well, and a lot of my Arturia sounds, and I couldn't get the parameters set perfectly. There is always tracking delay, even when I tweaked every parameter just to experiment. I also recorded over tracks to see where the MIDI ends up. It's always behind the beat. I have to say, I'm kind of disappointed.

Every pitch-to-MIDI solution has latency, which is caused by the wavelengths involved. It is not possible to build such a system without latency.
When playing with a MIDI guitar synth, the situation is very much like with a church organ: you won't hear the organist complaining about the latency, although every meter of distance adds roughly 3 ms of latency…
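The two numbers in that comparison are easy to sanity-check. As a rough sketch (my own back-of-the-envelope figures, not anything from Jam Origin): a pitch tracker needs to see something on the order of one full period of the waveform before it can identify a note, and sound travels through air at about 343 m/s:

```python
# Back-of-the-envelope numbers behind the two latency claims above.
# Assumptions (mine, for illustration): a pitch tracker needs at least
# roughly one full waveform period to identify a note, and sound
# travels at about 343 m/s in air.

SPEED_OF_SOUND_M_S = 343.0

def min_detection_latency_ms(freq_hz: float) -> float:
    """Rough lower bound on detection time: one full period, in ms."""
    return 1000.0 / freq_hz

def distance_latency_ms(meters: float) -> float:
    """Acoustic delay for sound travelling a given distance, in ms."""
    return meters / SPEED_OF_SOUND_M_S * 1000.0

low_e = min_detection_latency_ms(82.41)  # low E string is ~82.41 Hz
print(f"low E period: {low_e:.1f} ms")                     # ~12.1 ms
print(f"1 m of air:   {distance_latency_ms(1.0):.1f} ms")  # ~2.9 ms
```

So on the low strings a good 10+ ms of latency is baked into the physics before any buffer or plugin overhead, and each meter between organ pipe and organist adds about 3 ms, just as described above.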

Modes of playing
For guitar-synth playing there are two modes; if you train them, you get better results with any guitar-to-MIDI solution.

The two modes:

  1. Play on the guitar attack: use the synth as a non-attacking addition (e.g. pad sounds, strings).

  2. Play on the synth attack: disregard the guitar attack and only listen to the synth's attack. Ignore the physical attack pulse of your finger/pick.
    This takes training, and is basically the feel church organ players have to deal with. :wink:
    A soft picking style or a soft pick can help you ignore the (too) early tactile info from your picking hand.
    The sync button in MG delays the normal guitar channel, so that you can have a normal guitar in sync with the synth attack.


I see what you're saying. I play in a gospel church and mess with the organ sometimes. It doesn't feel quite as latent as this; however, I do understand how changing my playing can benefit the performance. I'm an extremely percussive player and am trying to bring the soft synths into a more Brazilian flavor, so I can see how this could be a pretty difficult task compared to just playing pads or a line. I guess I'm trying to find the best option for latency, even if there is always going to be some, and I'm just hoping there is something more I can do besides changing my playing, in case I'm missing something. I'll try to mess with the synth attack. However, the app splits the synth to the left and the guitar to the right. How would I make them both left and right, or a mono signal?

Thanks for the reply.


On iOS, clicking on the "mix" title will turn it into "split" and vice versa.
On desktop this option is not yet implemented.