# Store data in HDFS

## Hadoop cluster

The Beagle Hadoop cluster has 8 computers (nodes).

[HDFS health](http://165.132.137.159:9870/dfshealth.html#tab-datanode)

![HDFS health](//_static/_images/dfshealth.png)

## Access HDFS

### Using the Hadoop web explorer

[Hadoop web explorer](http://165.132.137.159:9870/explorer.html)

### Using the command line

#### Remote access to a node in the Hadoop cluster over SSH

```bash
ssh beagle@165.132.137.241
```

#### Access files

```bash
hadoop fs -ls /beagle
```

![folder beagle](//_static/_images/hsdfs_folder_beagle.png)

### Using the HTTP API

- http://165.132.137.159:8080/api/api/hdfs?hdfsPath=<>

## Beagle project structure

- Each Beagle project is located in a folder in HDFS.
- A Beagle project contains the following folders:

![beagle project structure](//_static/_images/beagle_project_structure.png)

- 360_photos: panorama images
- _origional: uploaded data from the web
- bim: BIM models (IFC, FBX, ...)
- data: point cloud data
- index: index of the point cloud data
- meta: metadata of the point cloud data
- operation: output of other MapReduce jobs (for example: change detection, RANSAC, ...)
- sample: sample data of the point cloud
- tiling: LOD (level-of-detail) point cloud data
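The command-line access described above combines with the project structure: once logged into a cluster node over SSH, the standard `hadoop fs` subcommands (`-ls`, `-put`, `-get`) operate on a project's folders. A minimal sketch is below; the project name `my_project` and the local file names are hypothetical placeholders, and the actual `hadoop fs` invocations are shown as comments because they require a cluster node.

```bash
#!/bin/sh
# Hypothetical project folder under the /beagle root shown earlier;
# substitute the name of a real project.
PROJECT=/beagle/my_project

# Typical operations on a cluster node (commented out: they need HDFS access):
#   hadoop fs -ls  "$PROJECT/data"               # list point cloud files
#   hadoop fs -put scan.las "$PROJECT/data"      # upload a local file
#   hadoop fs -get "$PROJECT/sample" ./sample    # download the sample data
echo "project root: $PROJECT"
```

The per-folder conventions from the project structure list determine where each kind of data goes, e.g. uploads from the web land in `_origional` and point cloud data in `data`.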
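The HTTP API endpoint above takes the HDFS path as the `hdfsPath` query parameter. A minimal sketch of composing and fetching such a request is below; the project path `/beagle/my_project` is a hypothetical placeholder, and the `curl` call is commented out because it requires network access to the cluster (the response format is not documented here).

```bash
#!/bin/sh
# Base endpoint as given in the section above.
HDFS_API="http://165.132.137.159:8080/api/api/hdfs"
# Hypothetical HDFS path to query; substitute a real project folder.
HDFS_PATH="/beagle/my_project"

# Compose the request URL with the hdfsPath query parameter.
URL="${HDFS_API}?hdfsPath=${HDFS_PATH}"
# curl -s "$URL"    # commented out: requires access to the cluster network
echo "$URL"
```

Note that paths containing spaces or special characters would need URL encoding before being placed in the query string.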