According to your screenshot, you are recording at a bitrate of 20000 kbit per second.
35 minutes is 35 * 60 seconds = 2100 seconds
20000 kbit per second is 20000/8 = 2500 kB per second
2500 kB per second is 2500/1000 = 2.5 MB per second
So 2100 seconds at 2.5 MB per second is
2100 seconds * 2.5 MB/second = 5250 MB ≈ 5.25 GB
That is what you have.
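If you want to sanity-check that yourself, here is a small Python sketch of the same arithmetic (the function name is just for illustration):

    def file_size_gb(bitrate_kbit_s, duration_s):
        """Approximate file size in GB for a given average video bitrate."""
        kbyte_per_s = bitrate_kbit_s / 8               # kbit/s -> kB/s
        mbyte_total = kbyte_per_s / 1000 * duration_s  # MB over the whole recording
        return mbyte_total / 1000                      # MB -> GB

    print(file_size_gb(20000, 35 * 60))  # -> 5.25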
A 1.5-hour video with a size of 1 GB has an average bitrate of 1 GB / 1.5 hours.
1.5 hours is 1.5 * 3600 seconds = 5400 seconds
1 GB is 1000 MB = 1000000 kB = 1000000 * 8 kbit = 8000000 kbit
That means a bitrate of
8000000 kbit / 5400 seconds ≈ 1481 kbit per second
Thus, this video was probably recorded with a bitrate of about 1500 kbit per second.
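The reverse calculation, estimating the average bitrate from a known file size, would look like this (again just a sketch):

    def bitrate_kbit_s(size_gb, duration_s):
        """Approximate average bitrate in kbit/s from file size and duration."""
        kbit_total = size_gb * 1000 * 1000 * 8  # GB -> MB -> kB -> kbit
        return kbit_total / duration_s

    print(bitrate_kbit_s(1, 1.5 * 3600))  # -> ~1481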
A bitrate of 20000 kbit per second produces video with very good quality;
a bitrate of 1500 kbit per second produces video with mediocre quality.
So, what is your issue now?