
Copying Fairbid Druid segments from AWS to GCP (postgres to postgres)

Imported from Confluence

Content may be outdated. Verify before following any procedures. Last updated: July 2023

Ticket: DEVOPSBLN-3648

Steps:

  1. Used DBeaver and this query to get the needed segments from Druid in AWS:

SELECT *
FROM public.druid_segments
WHERE datasource = 'FairbidSDK' AND start LIKE '2023-06-%';
  2. Used DBeaver to export data from the query to a CSV file with the following parameters:

image-2023-7-7_17-13-44.png

image-2023-7-7_17-14-13.png

Delimiter = |
Quote character = +

  3. Updated the segments' location from S3 to GCS:

> cat update2gcs.sh
#!/bin/bash
# Back up the CSV, then rewrite the segment load specs from S3 to GCS in place.
cp "$1" "$1.backup-$(date +%Y.%m.%d.%H.%M.%S)"
sed -i 's/"type":"s3_zip"/"type":"google"/g' "$1"
sed -i 's/"bucket":"bln-fairbid-druid-production"/"bucket":"gcs-fairbid-agp-druid-old-deep-storage-standard-useast1-prod"/g' "$1"
sed -i 's/"key":"segments/"path":"segments/g' "$1"
sed -i 's/,"S3Schema":"s3n"//g' "$1"

> chmod +x update2gcs.sh
> ./update2gcs.sh aws_fairbid_druid_delimeter_pipe.csv
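Before running the script against the real export, the sed rewrites can be sanity-checked on a single sample row. The JSON below is a hypothetical payload fragment shaped like a Druid S3 load spec (the segment key is made up for illustration); the bucket names and substitutions mirror the script above:

```shell
# Hypothetical sample of the payload column, shaped like a Druid S3 loadSpec.
sample='{"loadSpec":{"type":"s3_zip","bucket":"bln-fairbid-druid-production","key":"segments/FairbidSDK/part0.zip","S3Schema":"s3n"}}'

# Apply the same substitutions as update2gcs.sh and print the result.
printf '%s\n' "$sample" \
  | sed 's/"type":"s3_zip"/"type":"google"/g' \
  | sed 's/"bucket":"bln-fairbid-druid-production"/"bucket":"gcs-fairbid-agp-druid-old-deep-storage-standard-useast1-prod"/g' \
  | sed 's/"key":"segments/"path":"segments/g' \
  | sed 's/,"S3Schema":"s3n"//g'
```

This prints `{"loadSpec":{"type":"google","bucket":"gcs-fairbid-agp-druid-old-deep-storage-standard-useast1-prod","path":"segments/FairbidSDK/part0.zip"}}`, i.e. the GCS form of the load spec.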
  4. Imported the updated CSV file into the new PostgreSQL DB for Druid in GCP with DBeaver

image-2023-7-7_17-17-39.png

Use the same parameters as for the export:

Delimiter = |
Quote character = +
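The export and import steps can also be scripted with psql's \copy, which accepts the same delimiter and quote settings as the DBeaver wizard. A hedged sketch, not the procedure that was actually used: the connection-URL variables are hypothetical placeholders, and the table and column names are taken from the query earlier in this page:

```shell
# Sketch only: \copy runs client-side, so it works against managed databases.
# $AWS_DRUID_DB_URL / $GCP_DRUID_DB_URL are hypothetical placeholder variables.
EXPORT_SQL="\\copy (SELECT * FROM public.druid_segments WHERE datasource = 'FairbidSDK' AND start LIKE '2023-06-%') TO 'aws_fairbid_druid_delimeter_pipe.csv' WITH (FORMAT csv, DELIMITER '|', QUOTE '+')"
IMPORT_SQL="\\copy public.druid_segments FROM 'aws_fairbid_druid_delimeter_pipe.csv' WITH (FORMAT csv, DELIMITER '|', QUOTE '+')"

# On the AWS side:                         psql "$AWS_DRUID_DB_URL" -c "$EXPORT_SQL"
# On the GCP side (after update2gcs.sh):   psql "$GCP_DRUID_DB_URL" -c "$IMPORT_SQL"
printf '%s\n%s\n' "$EXPORT_SQL" "$IMPORT_SQL"
```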

  5. Used the Druid UI to verify that the segments were added and that Druid was able to download data from the new location

Screenshot 2023-07-07 at 17.20.24.png
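Alongside the UI check, the copied rows can be sanity-checked directly in the GCP metadata DB. This query is an assumption built from the export query above (same table, datasource, and date filter); `$GCP_DRUID_DB_URL` is a hypothetical placeholder:

```shell
# Count the copied segment rows and show their time range in the GCP metadata DB.
CHECK_SQL="SELECT count(*), min(start), max(\"end\") FROM public.druid_segments WHERE datasource = 'FairbidSDK' AND start LIKE '2023-06-%'"
# psql "$GCP_DRUID_DB_URL" -c "$CHECK_SQL"
echo "$CHECK_SQL"
```

The row count should match the number of data rows in the exported CSV.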