unexpected downscaling on 721k multi-tile VT
Posted: Thu Nov 03, 2016 3:04 pm
Hi
I'm currently using the Amplify Texture trial version with Unity 5.4 terrains, and it seems to be a very interesting tool to have when dealing with GIS. I'm interested in the product, so I'm trying to evaluate its limits. I usually work with multiple 8192x8192 PNG tiles and an _x#_y#.png naming convention.
The tool works fine for small datasets, but I'm having a hard time when testing with a bigger set, which is composed of a grid of 88 x 64 tiles, each one still being an 8192x8192 PNG image. This makes a total size of 720896 x 524288 pixels.
The virtualized texture I obtain seems to be greatly downscaled with respect to the original tile resolution. I've set the virtual texture properties to hardware level = Ultra and a virtual size = 2048K (1024K also tested). I've played with the compression quality (set to 100%) and requested lossless compression.
Since the total number of pixels does not exceed the max resolution limit of 2048k x 2048k, I assumed my grid size was OK. My tiles have power-of-two resolutions, but the grid itself does not; could that be why I get a scaled result?
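For reference, here is the arithmetic behind that assumption (a quick sanity check in plain Python that I ran on my side, nothing Amplify-specific):

    # Sanity check of the dataset dimensions.
    tile = 8192                    # tile resolution, a power of two
    grid_x, grid_y = 88, 64        # tile counts per axis

    def is_pow2(n):
        return n > 0 and n & (n - 1) == 0

    total_x, total_y = grid_x * tile, grid_y * tile
    print(total_x, total_y)                    # 720896 524288
    print(is_pow2(tile))                       # True:  each tile is a power of two
    print(is_pow2(grid_x), is_pow2(grid_y))    # False True: 88 is not a power of two
    print(is_pow2(total_x), is_pow2(total_y))  # False True: so the total width isn't either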
I've previously tested with fewer tiles and had no such scaling problem.
I must add that some tiles are missing from my dataset (not the first tile, so the grid can still be computed). Amplify seemed to compute the grid without complaining and rendered the virtual texture with the missing tiles filled with black pixels, which is fine for me, but I wonder whether those missing tiles could confuse AT (it seems unlikely, since there was no downscaling for smaller incomplete grids).
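In case it helps, this is roughly how I list the missing tiles on my side (a small Python sketch; the base name "ortho" is a placeholder for my actual dataset name, and I'm assuming 0-based x/y indices in the _x#_y#.png convention):

    # List which tiles of the 88 x 64 grid are missing on disk.
    # Naming convention assumed: <base>_x<col>_y<row>.png with 0-based indices.
    import os

    base, grid_x, grid_y = "ortho", 88, 64   # "ortho" is a placeholder base name
    missing = [(x, y)
               for y in range(grid_y)
               for x in range(grid_x)
               if not os.path.exists(f"{base}_x{x}_y{y}.png")]
    print(len(missing), "tiles missing, e.g.:", missing[:10])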
I haven't built the project yet; I'm still in the editor. But the problem didn't arise before with smaller datasets, so I suppose something is wrong in my settings.
My workstation is a Windows 10 PC with an NVIDIA GeForce GTX 580 with 3 GB of VRAM, running Unity 5.4 64-bit.
=> May I ask you to confirm the max resolution reachable for one virtualized texture, and whether that resolution applies to each of the 16 allowed vtextures per scene? What I understand is that the max resolution of one virtualized texture is 2048000 x 2048000 pixels; is that correct? I also read "Virtually unlimited number of textures", so I suppose an increase in the number of (input) textures implies a decrease in (input) texture resolution, so that the virtualized texture still fits within the 2048k x 2048k limit.
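To show my reasoning in numbers, assuming the "K" suffix means x1024 (that interpretation is my own assumption; please correct me if the limit is actually decimal, i.e. 2,048,000):

    # My interpretation of the 2048K per-axis limit (assuming K = x1024).
    limit = 2048 * 1024              # 2097152 pixels per axis
    tiles_per_axis = limit // 8192   # 256 8k tiles would fit along one axis
    print(limit, tiles_per_axis)     # 2097152 256
    # An 88 x 64 grid of 8192 px tiles (720896 x 524288) is well inside that limit.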
=> Do you have any advice to help me solve the downscaling problem I'm facing with my 88x64 grid of 8k textures?
Thanks for your time.
Edit:
I've just discovered that, for some reason, my build target was set to 32-bit Windows, not 64-bit.
I'm switching to 64-bit, waiting for the import process to finish, and will then retry the VT generation.