LandXML can accommodate irregular TINs as well as vector points and lines, so it has its place. However, like any XML it's a very inefficient vector file format, heavily bloated with ASCII text, and that overhead can't be avoided. Interpreting it takes a lot of processing resources, particularly if your surface points are irregular. So it's not ideal for mobile devices, and that's probably a good reason for the size limitation.
A regular surface matrix alone (no points or lines) in a binary raster format is much more efficient, but in this case you don't have that option.
You have a couple of choices. If you want to maintain the current resolution, you could split your surface into many smaller chunks.
Or, if you can live with resampling the surface to a coarser resolution, the file size will come down considerably.
As a guide, here's an example area I just exported from Global Mapper: at a 0.5 m interval it came out at 139 MB, and after resampling to a 1.0 m interval it dropped to 34 MB. For comparison only, the same regular surface at a 0.5 m interval as a binary GeoTIFF is just 3 MB.
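The roughly 4x drop from 139 MB to 34 MB is what you'd expect from the grid geometry: doubling the interval cuts the node count by about a factor of four. A quick sketch (the 1 km x 1 km site is a made-up example, not the area above):

```python
def grid_points(width_m, height_m, interval_m):
    """Number of nodes in a regular grid covering width x height."""
    cols = int(width_m / interval_m) + 1
    rows = int(height_m / interval_m) + 1
    return cols * rows

# Hypothetical 1 km x 1 km site
at_half_m = grid_points(1000, 1000, 0.5)   # 2001 * 2001 = 4,004,001 nodes
at_one_m = grid_points(1000, 1000, 1.0)    # 1001 * 1001 = 1,002,001 nodes

print(at_half_m / at_one_m)  # ~4.0, matching the 139 MB -> 34 MB drop
```

Since file size scales with node count, each doubling of the interval gives you roughly a 4x reduction.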
@Wombo is correct. We set the limit to 25 MB because rendering and processing larger files on mobile phones would require huge computational power. Resampling the data to a coarser resolution would help in this case.
I’ll take note of increasing the file size limit as a feature request.
The other, and perhaps better, alternative for a feature request is for Emlid Flow 360 to do the resampling itself.
The workflow would be to submit a LandXML that is too big, such as my example, and have Emlid Flow 360 resample it on import.
Not knowing the LandXML file format in detail, I don't know whether that's feasible, or whether Studio is the better place for that resampling. However, the background conversion features in Studio are so nice that something similar for LandXML would be perfect.
What is your base data, a raster or vector points? I use QGIS daily, and if starting from a raster I would just resample it to the final GSD before vectorizing. If I was starting from points, I would rasterize, resample, then re-vectorize.
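The key step in that workflow is the resample. In practice you'd do it in QGIS or with GDAL (e.g. `gdalwarp -tr 1.0 1.0 -r average`), but the arithmetic is simple block averaging. A minimal pure-Python sketch with a made-up 4x4 elevation grid:

```python
def block_average(grid, factor):
    """Downsample a 2-D list of elevations by averaging factor x factor blocks."""
    rows, cols = len(grid), len(grid[0])
    out = []
    for r in range(0, rows - rows % factor, factor):
        row = []
        for c in range(0, cols - cols % factor, factor):
            block = [grid[r + i][c + j] for i in range(factor) for j in range(factor)]
            row.append(sum(block) / len(block))
        out.append(row)
    return out

# Hypothetical fine grid at 0.5 m, resampled to 1.0 m (factor 2)
fine = [[1.0, 1.0, 2.0, 2.0],
        [1.0, 1.0, 2.0, 2.0],
        [3.0, 3.0, 4.0, 4.0],
        [3.0, 3.0, 4.0, 4.0]]
coarse = block_average(fine, 2)
print(coarse)  # [[1.0, 2.0], [3.0, 4.0]]
```

Resampling before vectorizing keeps the contour and TIN steps working on a quarter of the data, which is where the file-size savings come from.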
How many acres is that surface? I've worked on hundreds of acres and never seen one that big. It sounds like it needs some optimization. Have you thought about breaking it into zones?
What content is in the file? You really only need the perimeter and the tri-mesh. I think they made a change so that you may not even need the perimeter, and it will represent just the triangles present. You don't really need points, contours, breaklines, etc. If you want to see contours, I wonder whether they could be pulled through as a shapefile?
The base data is an orthophoto, vectorised to contours, then to a TIN mesh, then to LandXML. I use this process because, AFAIK, mesh creation proceeds only from a vector and LandXML creation proceeds only from a mesh.
The issue, as best I understand it, is that TIN mesh creation involves interpolation and doesn't appear to offer much control over that process. I think it's in the mesh creation, and its interpolation, that the problems begin.
So I generated data from a new site today. I clipped a very small area of the site (6,212 square metres) and, following @michaelL's advice, generated a TIN with vertices only.
Despite several attempts, it appears that the "floor" for the LandXML file size is 37 MB.
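Part of why LandXML hits a floor like this is the ASCII encoding mentioned earlier: each TIN vertex costs tens of bytes of text, versus 12 bytes for three 32-bit binary floats in a raster format like GeoTIFF. A rough sketch (the coordinates and element layout are illustrative, not from my actual file):

```python
# One point record as it might appear in a LandXML <Pnts> list (illustrative)
ascii_point = '<P id="123456">6543210.123 123456.789 45.678</P>\n'
ascii_bytes = len(ascii_point.encode("ascii"))

# The same point as raw x, y, z 32-bit floats in a binary raster
binary_bytes = 3 * 4

print(ascii_bytes, binary_bytes)  # the ASCII entry is roughly 4x the binary size
```

Multiply that per-point overhead by millions of vertices and the 37 MB floor, versus a 3 MB GeoTIFF for the same surface, is no surprise.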
When you say 0.5 m resolution, do you mean the file is basically gridded at 0.5 m, or is that an average with small grade breaks like swales and curbs preserved? It will be interesting to hear @MikeH's resolution requirements, because that is way too sparse for what we do. Our models are configured for survey layout, so there can be a lot of fine detail required in places for accurate cut/fill analysis.