# Processing point cloud data

- Beagle uses Hadoop MapReduce to process point cloud data: it reads input from HDFS and generates LOD point cloud data (PCD).
- Old documentation: [confluence - Beagle data storage model and implementation](https://conworth.atlassian.net/wiki/spaces/CONWORTH/pages/244383745/B-Eagle+s+Data+Storage+Model+Implementation)

## Procedure

### Upload files to HDFS

- Text files can be uploaded to HDFS directly. See [how to upload](./storing_data.md).
- las/laz files must be split first (<4,000,000 points per file); upload the split files afterwards.
- Then run the Hadoop MapReduce job.

#### Split files

You can use lassplit to split las/laz files:

```bash
lassplit "D:\\*.las" -merged -split 4000000 -o ./.las
```

### Run the Hadoop MapReduce job

![beagle tiler parameters](//_static/_images/beagle_tiler_params.png)

```bash
/hadoop/bin/hadoop jar /hadoop/share/hadoop/beagle/beagle-tiler-2.0.jar com.beagle.tiler.task.BTask_Tiler2 \
  -u m -so false -sp 0.0001 -sa POISSON \
  -i hdfs://165.132.137.159:9000/beagle/Test.bprj/data/SDV_disseration/pcd_skew_10m.pts \
  -sc true -bs 128
```
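lassplit only handles las/laz input. For a plain-text point file (e.g. a `.pts` file with one point per line, as in the job invocation above), the same ≤4,000,000-point split can be done with coreutils `split`. A minimal sketch, using a toy 4-point limit in place of 4,000,000 and hypothetical filenames:

```bash
# Generate a hypothetical text point file with 10 points ("x y z" per line)
seq 1 10 | awk '{print $1, $1, $1}' > input.pts

# Split into chunks of at most 4 points each
# (substitute 4000000 for real data)
split -l 4 -d --additional-suffix=.pts input.pts chunk_

# Produces chunk_00.pts (4 points), chunk_01.pts (4 points), chunk_02.pts (2 points)
ls chunk_*.pts
```

The resulting `chunk_*.pts` files can then be uploaded to HDFS individually, the same way as any other text file.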