# Processing point cloud data

## Procedure

### Upload files to HDFS

- For a text file, you can upload it to HDFS directly; a minimal upload example follows this list.

- For LAS/LAZ files, split your data first (fewer than 4,000,000 points per file), then upload the split files.

- Run the Hadoop map-reduce job.
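
The exact upload commands depend on your cluster layout. The sketch below uses the standard `hadoop fs` CLI and assumes the namenode address and project path that appear in the tiler command later in this section; adjust both to your deployment.

```bash
# Create the target directory on HDFS; the path is taken from the tiler
# command below and may differ in your project.
/hadoop/bin/hadoop fs -mkdir -p hdfs://165.132.137.159:9000/beagle/Test.bprj/data/SDV_disseration

# Upload a local text point file directly.
/hadoop/bin/hadoop fs -put pcd_skew_10m.pts hdfs://165.132.137.159:9000/beagle/Test.bprj/data/SDV_disseration/

# Check that the file arrived.
/hadoop/bin/hadoop fs -ls hdfs://165.132.137.159:9000/beagle/Test.bprj/data/SDV_disseration/
```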

### Split files

- You can use `lassplit` to split LAS/LAZ files, for example:

```bash
lassplit "D:\<folder_name>\*.las" -merged -split 4000000 -o ./<file_name>.las
```
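
If you have many files to prepare, the splitting and uploading steps can be scripted. The sketch below simply chains the `lassplit` call above with a `hadoop fs -put`; the directory names, the `tile.las` output name, and the HDFS target are assumptions, and `lassplit` must be available on the machine that holds the raw data.

```bash
#!/usr/bin/env bash
# Sketch: split every LAS file in a local folder, then push the pieces to HDFS.
# INPUT_DIR, SPLIT_DIR, and HDFS_TARGET are placeholders; adjust to your setup.
INPUT_DIR=./raw_las
SPLIT_DIR=./split_las
HDFS_TARGET=hdfs://165.132.137.159:9000/beagle/Test.bprj/data/SDV_disseration

mkdir -p "$SPLIT_DIR"

# Same lassplit flags as in the example above: merge the inputs, then write
# chunks of at most 4,000,000 points each.
lassplit "$INPUT_DIR/*.las" -merged -split 4000000 -o "$SPLIT_DIR/tile.las"

# Upload every chunk to HDFS.
/hadoop/bin/hadoop fs -put "$SPLIT_DIR"/*.las "$HDFS_TARGET"/
```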

### Run the Hadoop map-reduce job

![beagle tiler parameters](//_static/_images/beagle_tiler_params.png)
```bash
/hadoop/bin/hadoop jar /hadoop/share/hadoop/beagle/beagle-tiler-2.0.jar \
  com.beagle.tiler.task.BTask_Tiler2 \
  -u m -so false -sp 0.0001 -sa POISSON \
  -i hdfs://165.132.137.159:9000/beagle/Test.bprj/data/SDV_disseration/pcd_skew_10m.pts \
  -sc true -bs 128
```
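
After submitting the job, you can check its progress and look for the output with the standard Hadoop/YARN tools. The commands below are only a sketch: the tiler's output location is not documented here, so `<output_dir>` is a placeholder you need to fill in yourself.

```bash
# List running YARN applications to check the job's status.
/hadoop/bin/yarn application -list

# Browse the project directory on HDFS to locate the tiler's output.
/hadoop/bin/hadoop fs -ls -R hdfs://165.132.137.159:9000/beagle/Test.bprj/

# Copy the results back to the local file system (<output_dir> is a placeholder).
/hadoop/bin/hadoop fs -get hdfs://165.132.137.159:9000/beagle/Test.bprj/<output_dir> ./local_results
```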