# Processing point cloud data
Beagle uses Hadoop MapReduce to process point cloud data: it reads raw data from HDFS and generates level-of-detail (LOD) point cloud data (PCD).
Old Confluence documentation: Beagle data storage model and implementation
## Procedure

### Upload files to HDFS

- Text files can be uploaded to HDFS directly (a minimal upload sketch follows this list).
- las/laz files must be split first (fewer than 4,000,000 points per file); upload the split files afterwards.
- Then run the Hadoop MapReduce job (see below).
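A minimal sketch of the direct upload step, reusing the namenode URI and project path from the job command below; the local file name and target directory are illustrative, not fixed by Beagle:

```bash
# Create the target directory in HDFS (path is illustrative).
hdfs dfs -mkdir -p hdfs://165.132.137.159:9000/beagle/Test.bprj/data/my_dataset

# Upload a local text point cloud file (e.g. .pts) into that directory.
hdfs dfs -put ./pcd_skew_10m.pts hdfs://165.132.137.159:9000/beagle/Test.bprj/data/my_dataset/
```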
### Split files
You can use lassplit (from LAStools) to split las/laz files:

```bash
lassplit "D:\<folder_name>\*.las" -merged -split 4000000 -o ./<file_name>.las
```
### Run the Hadoop MapReduce job

```bash
/hadoop/bin/hadoop jar /hadoop/share/hadoop/beagle/beagle-tiler-2.0.jar \
    com.beagle.tiler.task.BTask_Tiler2 \
    -u m -so false -sp 0.0001 -sa POISSON \
    -i hdfs://165.132.137.159:9000/beagle/Test.bprj/data/SDV_disseration/pcd_skew_10m.pts \
    -sc true -bs 128
```
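To check that the job is running and to inspect its results, you can use the standard YARN and HDFS CLIs; the output location is an assumption, since this page does not state where the tiler writes the LOD tiles:

```bash
# List running YARN applications to monitor the tiler job.
yarn application -list

# Inspect the project tree in HDFS. The exact output location of the
# LOD tiles is an assumption and may differ in your deployment.
hdfs dfs -ls -R hdfs://165.132.137.159:9000/beagle/Test.bprj
```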