PostgresOSM.import_subregion_osm_pbf

PostgresOSM.import_subregion_osm_pbf(subregion_names, data_dir=None, update_osm_pbf=False, if_exists='replace', chunk_size_limit=50, parse_raw_feat=False, transform_geom=False, transform_other_tags=False, pickle_pbf_file=False, rm_osm_pbf=False, confirmation_required=True, verbose=False, **kwargs)

Import data of geographic region(s) that do not have (sub-)subregions into a database.
Parameters:

- subregion_names (str or list or None) – name(s) of geographic region(s)
- data_dir (str or None) – directory where the PBF data file is located/saved; if None (default), the default directory is used
- update_osm_pbf (bool) – whether to update the .osm.pbf data file (if available); defaults to False
- if_exists (str) – what to do if the table already exists: 'replace' (default), 'append' or 'fail'
- chunk_size_limit (int) – threshold (in MB) that triggers the use of the chunk parser; defaults to 50; if the size of the .osm.pbf file (in MB) is greater than chunk_size_limit, the file is parsed chunk-wise
- parse_raw_feat (bool) – whether to parse each feature in the raw data; defaults to False
- transform_geom (bool) – whether to transform a single coordinate (or a collection of coordinates) into a geometric object; defaults to False
- transform_other_tags (bool) – whether to transform 'other_tags' into a dictionary; defaults to False
- pickle_pbf_file (bool) – whether to save the .pbf data as a .pickle file; defaults to False
- rm_osm_pbf (bool) – whether to delete the downloaded .osm.pbf file; defaults to False
- confirmation_required (bool) – whether to ask for confirmation to proceed; defaults to True
- verbose (bool or int) – whether to print relevant information to the console; defaults to False
- kwargs – optional parameters of .import_osm_pbf_layer()
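Where transform_other_tags=True, the raw 'other_tags' string (an hstore-style sequence of "key"=>"value" pairs, as produced by the OSM data layers) is turned into a dictionary. The following is a minimal illustrative parser, not pydriosm's actual implementation:

```python
import re


def parse_other_tags(other_tags):
    """Convert an hstore-style 'other_tags' string into a dict (or None).

    Illustrative sketch only; pydriosm's internal transformation may differ.
    """
    if other_tags is None:
        return None
    # Each pair looks like "key"=>"value"; keys/values may contain escaped chars.
    pairs = re.findall(r'"((?:[^"\\]|\\.)*)"=>"((?:[^"\\]|\\.)*)"', other_tags)
    return dict(pairs)


print(parse_other_tags('"traffic_signals:direction"=>"backward"'))
# {'traffic_signals:direction': 'backward'}
```

This mirrors the kind of value seen in the Waterloo example output below, where one 'points' row carries {'traffic_signals:direction': 'backward'} in its other_tags column.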
Examples:

>>> import os
>>> from pyhelpers.dir import cd
>>> from pyhelpers.store import load_pickle
>>> from pydriosm.ios import PostgresOSM

>>> osmdb_test = PostgresOSM(database_name='osmdb_test')
Password (postgres@localhost:5432): ***
Connecting postgres:***@localhost:5432/osmdb_test ... Successfully.

>>> # -- Example 1: Import PBF data of Rutland -----------------------------------

>>> sr_name = 'Rutland'  # name of a subregion
>>> dat_dir = "tests"  # name of a data directory where the subregion data is

>>> osmdb_test.import_subregion_osm_pbf(sr_name, dat_dir, rm_osm_pbf=True,
...                                     verbose=True)
To import .osm.pbf data of the following geographic region(s) into postgres:***@...:
    Rutland
? [No]|Yes: yes
Downloading "rutland-latest.osm.pbf" to "tests\" ... Done.
Importing the data into table "Rutland" ...
    "points" ... Done: <total of rows> features.
    "lines" ... Done: <total of rows> features.
    "multilinestrings" ... Done: <total of rows> features.
    "multipolygons" ... Done: <total of rows> features.
    "other_relations" ... Done: <total of rows> features.
Deleting "tests\rutland-latest.osm.pbf" ... Done.

>>> # -- Example 2: Import PBF data of Victoria and Waterloo ---------------------

>>> # The PBF data of Victoria and Waterloo is available on BBBike download server
>>> osmdb_test.DataSource = 'BBBike'
>>> sr_names = ['Victoria', 'Waterloo']

>>> # Note this may take a few minutes or even longer
>>> osmdb_test.import_subregion_osm_pbf(
...     sr_names, dat_dir, parse_raw_feat=True, transform_geom=True,
...     transform_other_tags=True, pickle_pbf_file=True, rm_osm_pbf=True,
...     verbose=True)
To import .osm.pbf data of the following geographic region(s) into postgres:***@...:
    Victoria
    Waterloo
? [No]|Yes: yes
Downloading "Victoria.osm.pbf" to "tests\" ... Done.
Parsing "tests\Victoria.osm.pbf" ... Done.
Importing the data into table "Victoria" ...
    "points" ... Done: <total of rows> features.
    "lines" ... Done: <total of rows> features.
    "multilinestrings" ... Done: <total of rows> features.
    "multipolygons" ... Done: <total of rows> features.
    "other_relations" ... Done: <total of rows> features.
Saving "Victoria-pbf.pickle" to "tests" ... Done.
Deleting "tests\Victoria.osm.pbf" ... Done.
Downloading "Waterloo.osm.pbf" to "tests\" ... Done.
Parsing "tests\Waterloo.osm.pbf" ... Done.
Importing the data into table "Waterloo" ...
    "points" ... Done: <total of rows> features.
    "lines" ... Done: <total of rows> features.
    "multilinestrings" ... Done: <total of rows> features.
    "multipolygons" ... Done: <total of rows> features.
    "other_relations" ... Done: <total of rows> features.
Saving "Waterloo-pbf.pickle" to "tests\" ... Done.
Deleting "tests\Waterloo.osm.pbf" ... Done.

>>> # Since `pickle_pbf_file` was set to be True,
>>> # the parsed PBF data have been saved as Pickle files

>>> # Data of Victoria
>>> victoria_pbf = load_pickle(cd(dat_dir, "Victoria-pbf.pickle"))
>>> # Data of the 'points' layer of Victoria
>>> victoria_pbf_points = victoria_pbf['points']
>>> victoria_pbf_points.head()
         id                      coordinates  ... man_made other_tags
0  25832817  POINT (-123.3101944 48.4351988)  ...     None       None
1  25832849  POINT (-123.3162637 48.4336654)  ...     None       None
2  25832953  POINT (-123.3157486 48.4309841)  ...     None       None
3  25832954  POINT (-123.3209478 48.4324002)  ...     None       None
4  25832995    POINT (-123.322405 48.432167)  ...     None       None
[5 rows x 12 columns]

>>> # Data of Waterloo
>>> waterloo_pbf = load_pickle(cd(dat_dir, "Waterloo-pbf.pickle"))
>>> # Data of the 'points' layer of Waterloo
>>> waterloo_pbf_points = waterloo_pbf['points']
>>> waterloo_pbf_points.head()
         id  ...                                 other_tags
0  10782939  ...                                       None
1  10782965  ...                                       None
2  14509209  ...                                       None
3  14657092  ...  {'traffic_signals:direction': 'backward'}
4  14657140  ...                                       None
[5 rows x 12 columns]

>>> # Delete the Pickle files
>>> os.remove(cd(dat_dir, "Victoria-pbf.pickle"))
>>> os.remove(cd(dat_dir, "Waterloo-pbf.pickle"))

>>> # Delete the database 'osmdb_test'
>>> osmdb_test.drop_database(verbose=True)
To drop the database "osmdb_test" from postgres:***@localhost:5432
? [No]|Yes: yes
Dropping "osmdb_test" ... Done.
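A note on chunk_size_limit: the switch to the chunk-wise parser is driven by the .osm.pbf file's size in MB. The check can be sketched as follows (assumed logic for illustration, not pydriosm's internals):

```python
import os


def use_chunk_parser(pbf_path, chunk_size_limit=50):
    """Return True if the file's size in MB exceeds chunk_size_limit.

    Hedged sketch of the assumed threshold check; the library's exact
    behaviour may differ.
    """
    size_mb = os.path.getsize(pbf_path) / (1024 ** 2)
    return size_mb > chunk_size_limit
```

For example, a 60 MB .osm.pbf file with the default limit of 50 MB would be parsed chunk-wise, while a small extract such as Rutland's would be read in one pass.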