05-14-2018 06:35 AM - last edited on 05-18-2018 12:52 PM by ian.anderson
On a customer's computer, when he loads a small lidar dataset and visualizes it in 3D, GPU usage spikes to 90% or more.
The graphics card is an NVIDIA K4200.
Is this normal behavior?
He needs to work with huge point cloud data. Are there any parameters for managing the behavior of the 3D window so that it does not overload the graphics card?
05-14-2018 07:14 AM
The number of points loaded per tile in the 3D viewer is controlled by a preference. By default, 50,000 points are loaded per tile. You can increase or decrease this value based on the RAM you have available.
Look for the preference "Budget of number of points per tile for display in 3D viewer" in the Point Cloud category (IMAGINE Preferences -> Viewing -> Point Cloud).
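To illustrate the idea behind this preference: the viewer caps how many points from each tile are sent to the GPU, subsampling tiles that exceed the budget. The sketch below is not IMAGINE's actual implementation, and the function name `apply_point_budget` is hypothetical; it only demonstrates the per-tile budget concept.

```python
import random

def apply_point_budget(tile_points, budget=50_000):
    """Return at most `budget` points from a tile, randomly subsampled.

    Mirrors the concept of the "Budget of number of points per tile"
    preference: tiles under the budget pass through unchanged, larger
    tiles are thinned so the GPU never receives more than `budget`
    points per tile. (Illustrative only, not the product's code.)
    """
    points = list(tile_points)
    if len(points) <= budget:
        return points
    return random.sample(points, budget)

# A tile with 200,000 points is trimmed to the 50,000-point default budget.
tile = [(i, i, float(i)) for i in range(200_000)]
print(len(apply_point_budget(tile)))  # 50000
```

Lowering the budget reduces GPU and RAM load per tile at the cost of a sparser preview; raising it gives denser rendering if the hardware can keep up.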