Hardware-Accelerated Video?
Does LMI Central use hardware acceleration for encoding/decoding the video stream, and is the codec published?
I can only speak anecdotally, but the CPU and GPU usage attributed to an LMI remote control session seems to have increased within the past year or so. CPU and GPU usage on both sides used to be roughly 8% to 10%, but now both are around 16% to 20%. I notice this on the host computer as well as the client computer, and I haven't knowingly changed any settings (quality, color, resolution, number of monitors, OS, activity on the host computer, etc.).
So my (perhaps uninformed) hypothesis is that LMI may have used H.264 in the past but recently switched to H.265 (HEVC) or AV1. If so, the increased usage I think I'm seeing could be explained by a more resource-intensive codec, or by a codec that my graphics cards don't support for hardware acceleration.
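I don't know what codec LMI actually uses, but as a rough way to see which hardware decoders my machines expose, I've been poking at the local ffmpeg build. A stdlib-only Python sketch like the one below checks `ffmpeg -decoders` for common vendor-specific decoder names (the `cuvid`/`qsv` suffixes are my assumption, covering NVIDIA NVDEC and Intel Quick Sync; note this only reflects what the ffmpeg build was compiled with, not a guarantee the GPU/driver actually accelerates that codec):

```python
import shutil
import subprocess

def hw_decoders(codecs=("h264", "hevc", "av1")):
    """Return a dict mapping each codec to the ffmpeg hardware decoder
    names found for it, or None if ffmpeg isn't on PATH.

    The suffix list is an assumption: 'cuvid' = NVIDIA NVDEC,
    'qsv' = Intel Quick Sync. Other backends (VAAPI, D3D11VA, etc.)
    are surfaced differently by ffmpeg and aren't detected here.
    """
    if shutil.which("ffmpeg") is None:
        return None
    out = subprocess.run(
        ["ffmpeg", "-hide_banner", "-decoders"],
        capture_output=True, text=True,
    ).stdout
    found = {}
    for codec in codecs:
        # e.g. "h264_cuvid" or "hevc_qsv" appearing in the decoder list
        found[codec] = [
            f"{codec}_{sfx}" for sfx in ("cuvid", "qsv")
            if f"{codec}_{sfx}" in out
        ]
    return found
```

On my setup, an empty list for `av1` alongside populated lists for `h264`/`hevc` would at least be consistent with the "new codec my GPU can't accelerate" theory, though it wouldn't prove anything about what LMI itself does.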
All of the computers in question are workstation-grade, with i7 processors and discrete GPUs (plus 16 GB+ RAM, SSDs, 300 Mbps+ internet connections, etc.). My client computer is a little old and probably due for replacement, so if I should target a new computer/GPU that supports hardware acceleration compatible with LMI, I can look into that. Likewise, we could target specific GPU hardware as we gradually replace host computers in the office.