Copying Fairbid Druid segments from AWS to GCP (Postgres to Postgres)
Imported from Confluence. Content may be outdated; verify before following any procedures. Last updated: July 2023
Ticket: DEVOPSBLN-3648
Steps:
- Used DBeaver and this query to get the needed segments from Druid in AWS:
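The query itself is not preserved above. As a hedged sketch, assuming Druid's default metadata table name `druid_segments`, selecting all segments still marked as used might look like:

```sql
-- Assumption: default Druid metadata table name; filter/columns may have differed
SELECT * FROM druid_segments WHERE used = true;
```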
- Used DBeaver to export data from the query to a CSV file, with the following parameters:
  Delimiter = |
  Quote character = +
- Updated the segments' location from S3 to GCS:
```shell
> cat update2gcs.sh
#!/bin/bash
# Back up the input file, then rewrite the S3 loadSpec fields to their GCS equivalents in place
cp "$1" "$1.backup-$(date +%Y.%m.%d.%H.%M.%S)"
sed -i 's/"type":"s3_zip"/"type":"google"/g' "$1"
sed -i 's/"bucket":"bln-fairbid-druid-production"/"bucket":"gcs-fairbid-agp-druid-old-deep-storage-standard-useast1-prod"/g' "$1"
sed -i 's/"key":"segments/"path":"segments/g' "$1"
sed -i 's/,"S3Schema":"s3n"//g' "$1"

> chmod +x update2gcs.sh
> ./update2gcs.sh aws_fairbid_druid_delimeter_pipe.csv
```
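As a sanity check, the same substitutions can be exercised against a single hypothetical loadSpec line before running the script on the real export:

```shell
# Hypothetical one-row sample imitating the loadSpec JSON inside a segment payload
sample='{"loadSpec":{"type":"s3_zip","bucket":"bln-fairbid-druid-production","key":"segments/x/0.zip","S3Schema":"s3n"}}'
echo "$sample" > sample.csv

# Apply the same substitutions as update2gcs.sh
sed -i 's/"type":"s3_zip"/"type":"google"/g' sample.csv
sed -i 's/"bucket":"bln-fairbid-druid-production"/"bucket":"gcs-fairbid-agp-druid-old-deep-storage-standard-useast1-prod"/g' sample.csv
sed -i 's/"key":"segments/"path":"segments/g' sample.csv
sed -i 's/,"S3Schema":"s3n"//g' sample.csv

cat sample.csv
# {"loadSpec":{"type":"google","bucket":"gcs-fairbid-agp-druid-old-deep-storage-standard-useast1-prod","path":"segments/x/0.zip"}}
```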
- Imported the updated CSV file into the new PostgreSQL DB for Druid in GCP with DBeaver,
  using the same parameters as for the export:
  Delimiter = |
  Quote character = +
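If DBeaver is not at hand on the target side, the same import can be sketched with psql's `\copy`, assuming the default `druid_segments` table and the delimiter/quote settings used for the export:

```sql
-- Assumption: default metadata table name; adjust the file path to the updated export
\copy druid_segments FROM 'aws_fairbid_druid_delimeter_pipe.csv' WITH (FORMAT csv, DELIMITER '|', QUOTE '+')
```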
- Used the Druid UI to verify that the segments were added and that Druid was able to download the data from the new location.
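Beyond eyeballing the UI, segment availability can also be checked with a Druid SQL query against the built-in `sys.segments` system table (the datasource name below is a placeholder):

```sql
-- 'fairbid' is a hypothetical datasource name; replace with the real one
SELECT segment_id, is_published, is_available
FROM sys.segments
WHERE datasource = 'fairbid';
```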
