duckspatial is an R package that simplifies reading and writing vector spatial data (e.g., sf objects) in a DuckDB database. This package is designed for users working with geospatial data who want to leverage DuckDB’s fast analytical capabilities while maintaining compatibility with R’s spatial data ecosystem.
You can install the development version of duckspatial from GitHub with:
# install.packages("pak")
pak::pak("Cidree/duckspatial")
This is a basic example which shows how to set up DuckDB for spatial data manipulation, and how to write and read vector data.
library(duckdb)
#> Warning: package 'duckdb' was built under R version 4.4.3
#> Loading required package: DBI
library(duckspatial)
library(sf)
#> Linking to GEOS 3.12.2, GDAL 3.9.3, PROJ 9.4.1; sf_use_s2() is TRUE
First, we create a connection to a DuckDB database (in this case, an in-memory database), make sure that the spatial extension is installed, and load it:
## create connection
conn <- dbConnect(duckdb())
## install and load spatial extension
ddbs_install(conn)
#> ℹ spatial extension version <76dc6da> is already installed in this database
ddbs_load(conn)
#> ✔ Spatial extension loaded
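If you want to double-check the extension’s status yourself, DuckDB exposes a `duckdb_extensions()` table function that can be queried directly through DBI. A minimal sketch (the `INSTALL` step downloads the extension, so it needs network access, and the version hash will differ per installation):

```r
library(DBI)
library(duckdb)

## connect and set up the spatial extension via plain SQL
conn <- dbConnect(duckdb())
dbExecute(conn, "INSTALL spatial; LOAD spatial;")

## query DuckDB's extension catalog for the spatial extension
ext <- dbGetQuery(conn, "
  SELECT extension_name, installed, loaded
  FROM duckdb_extensions()
  WHERE extension_name = 'spatial'
")
print(ext)

dbDisconnect(conn, shutdown = TRUE)
```

This is the same check that `ddbs_install()`/`ddbs_load()` save you from writing by hand.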
Now we can get some data to insert into the database. We are creating 10,000,000 random points.
## create n points
n <- 10000000
random_points <- data.frame(
  id = 1:n,
  x = runif(n, min = -180, max = 180), # Random longitude values
  y = runif(n, min = -90, max = 90)    # Random latitude values
)

## convert to sf
sf_points <- st_as_sf(random_points, coords = c("x", "y"), crs = 4326)
## view first rows
head(sf_points)
#> Simple feature collection with 6 features and 1 field
#> Geometry type: POINT
#> Dimension: XY
#> Bounding box: xmin: -138.0885 ymin: -83.68937 xmax: 127.3058 ymax: 65.52595
#> Geodetic CRS: WGS 84
#> id geometry
#> 1 1 POINT (-84.05372 -2.313132)
#> 2 2 POINT (19.89173 -83.68937)
#> 3 3 POINT (13.76448 -63.57522)
#> 4 4 POINT (127.3058 65.52595)
#> 5 5 POINT (-110.9474 40.40336)
#> 6 6 POINT (-138.0885 -71.29385)
Now we can insert the data into the database using the ddbs_write_vector() function. We use the proc.time() function to measure how long it takes, and we compare it with writing a GeoPackage with the write_sf() function:
## write data monitoring processing time
start_time <- proc.time()
ddbs_write_vector(conn, sf_points, "test_points")
#> ✔ Table test_points successfully imported
end_time <- proc.time()

## print elapsed time
elapsed_duckdb <- end_time["elapsed"] - start_time["elapsed"]
print(elapsed_duckdb)
#> elapsed
#> 9.64
## write data monitoring processing time
start_time <- proc.time()
gpkg_file <- tempfile(fileext = ".gpkg")
write_sf(sf_points, gpkg_file)
end_time <- proc.time()

## print elapsed time
elapsed_gpkg <- end_time["elapsed"] - start_time["elapsed"]
print(elapsed_gpkg)
#> elapsed
#> 115.25
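As an aside, base R’s system.time() wraps exactly this start/stop proc.time() bookkeeping in a single call. A minimal sketch, using Sys.sleep() as a stand-in for the write step:

```r
## system.time() returns user/system/elapsed times for the expression,
## equivalent to the difference of two proc.time() snapshots
timing <- system.time({
  Sys.sleep(0.2) # stand-in for ddbs_write_vector(conn, sf_points, "test_points")
})

## elapsed wall-clock seconds
print(timing["elapsed"])
```

The explicit proc.time() form is kept in the benchmarks above only because it makes the start/end snapshots visible.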
In this case, DuckDB was about 12 times faster (9.6 s vs 115.3 s). Now we will do the same exercise, but reading the data back into R:
## read data monitoring processing time
start_time <- proc.time()
sf_points_ddbs <- ddbs_read_vector(conn, "test_points")
#> ✔ Table test_points successfully imported.
end_time <- proc.time()

## print elapsed time
elapsed_duckdb <- end_time["elapsed"] - start_time["elapsed"]
print(elapsed_duckdb)
#> elapsed
#> 50.34
## read data monitoring processing time
start_time <- proc.time()
sf_points_gpkg <- read_sf(gpkg_file)
end_time <- proc.time()

## print elapsed time
elapsed_gpkg <- end_time["elapsed"] - start_time["elapsed"]
print(elapsed_gpkg)
#> elapsed
#> 32.38
For reading, DuckDB was slower in this case, taking roughly 1.6 times as long as read_sf() (50.3 s vs 32.4 s). Finally, don’t forget to disconnect from the database:
dbDisconnect(conn)
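The example above uses an in-memory database, so the table disappears on disconnect. Passing a dbdir path to dbConnect() makes the same workflow persist data on disk between sessions. A sketch using duckdb’s standard DBI interface (the file name is arbitrary, and a plain data.frame stands in for the spatial table):

```r
library(DBI)
library(duckdb)

## file-backed database: tables survive across connections
db_file <- tempfile(fileext = ".duckdb")
conn <- dbConnect(duckdb(), dbdir = db_file)
dbWriteTable(conn, "example", data.frame(id = 1:3))

## shutdown = TRUE also shuts down the underlying database instance
dbDisconnect(conn, shutdown = TRUE)

## reconnect: the table is still there
conn <- dbConnect(duckdb(), dbdir = db_file)
res <- dbGetQuery(conn, "SELECT count(*) AS n FROM example")
print(res)
dbDisconnect(conn, shutdown = TRUE)
```

The same dbdir argument works for the spatial workflow shown above, so a table written with ddbs_write_vector() can be read back in a later R session.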