This is a great idea (allowing for high-bandwidth cabled input of an iPhone source) but the current implementation has a few large/annoying drawbacks:
-- The iPhone is locked to full auto mode, which means white balance is always automatic and touch-to-expose is your only exposure control. Your colours will shift back and forth as you broadcast and there's no way to correct this.
-- There's no orientation lock option in the app, which means you have to lock your entire phone in portrait mode and then manually rotate the input in OBS by 90 degrees to make it display correctly (a scripted workaround is sketched after this list). If you forget that manual step, you'll have some ugly moments while broadcasting.
-- The visual feedback on the phone's display is synced to the timing of the stream output, not the output from the camera, so it slowly drifts out of sync as you stream. When you reframe, for example, it takes the phone screen a second to register the movement and display it; and if you're tracking a moving subject, you have to guess where the subject is relative to the camera because your visual feedback is delayed.
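For the rotation issue, one possible stopgap is OBS's built-in Python scripting (Tools > Scripts) to apply the 90-degree offset automatically instead of remembering to do it by hand. This is an untested sketch, not a fix from the app itself, and the source name "iPhone" is a placeholder for whatever your phone input is actually called in your scene:

```python
# Untested sketch for OBS's Python scripting support (Tools > Scripts).
import obspython as obs

SOURCE_NAME = "iPhone"  # placeholder - use your phone source's actual name
ROTATION_DEG = 90.0     # counteracts the portrait-locked feed

def rotate_phone_source():
    # Grab the scene currently active in the frontend.
    scene_source = obs.obs_frontend_get_current_scene()
    if scene_source is None:
        return
    scene = obs.obs_scene_from_source(scene_source)
    item = obs.obs_scene_find_source(scene, SOURCE_NAME)
    if item is not None:
        # Rotate the scene item so the portrait feed displays upright.
        obs.obs_sceneitem_set_rot(item, ROTATION_DEG)
    obs.obs_source_release(scene_source)

def script_description():
    return "Rotates the phone camera source 90 degrees on load."

def script_load(settings):
    rotate_phone_source()
```

(You can also just right-click the source and use Transform > Rotate 90 Degrees CW, but that has to be redone any time the transform gets reset.)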
If these three things could be fixed it would be amazing. As-is it's decent but not a complete solution - especially for anyone looking to broadcast semi-professionally or professionally.