Exporting Data in HarborDB
HarborDB provides powerful export capabilities to save your query results in various formats. Whether you need to share data with colleagues, import into other applications, or archive results for later analysis, this guide covers everything you need to know about exporting data efficiently and securely.
Export Formats Overview
HarborDB supports multiple export formats, each suited for different use cases:
CSV (Comma-Separated Values)
- Best for: Spreadsheets (Excel, Google Sheets), data import/export, basic data exchange
- Features: Custom delimiters, headers, text qualifiers, encoding options
- Limitations: No support for hierarchical data, limited data type preservation
JSON (JavaScript Object Notation)
- Best for: Web applications, APIs, NoSQL databases, data interchange
- Features: Pretty printing, compact formatting, preserve data types
- Limitations: Larger file size, not ideal for spreadsheets
Excel (Coming Soon)
- Best for: Business reporting, formatted spreadsheets, complex data presentation
- Features: Multiple sheets, formulas, formatting, charts
- Status: Planned for future release
Basic Export Process
Step 1: Prepare Your Data
- Execute a query or select a table in the sidebar
- Review the results to ensure they contain the data you want to export
- Consider filtering if you only need specific rows or columns
Step 2: Initiate Export
There are several ways to start an export:
From Query Results
- Click the Export button in the results toolbar
- Or right-click anywhere in the results grid
- Select "Export Results" from the context menu
From Table Navigation
- Right-click a table in the sidebar
- Select "Export Data"
- Choose export scope:
- Entire table: All data (use with caution on large tables)
- First N rows: Limited sample (recommended for testing)
- Custom query: Write specific SELECT statement
Keyboard Shortcuts
⌘ + E: Quick export dialog
⌘ + Shift + E: Export with last used settings
Step 3: Configure Export Settings
The export dialog provides configuration options:
General Settings
- File Name: Auto-generated or custom name
- Save Location: Choose folder destination
- Include Headers: Column names as first row (CSV)
- Encoding: UTF-8 (recommended), Latin-1, or system default
CSV Specific Options
- Delimiter: Comma, semicolon, tab, pipe, or custom character
- Text Qualifier: Single quote, double quote, or none
- Line Endings: macOS/Linux (LF), Windows (CRLF), or system default
- Null Representation: Empty string, "NULL", or custom text
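To see how these CSV options interact, Python's standard csv module exposes equivalent knobs: delimiter, quote character, quoting policy, and line terminator, with NULL handling done by substitution. This is a generic sketch (the column names and sample rows are invented for the example), not HarborDB's internal implementation:

```python
import csv
import io

# Sample rows; None stands in for SQL NULL (names are illustrative)
rows = [
    {"id": 1, "name": "Anna", "city": None},
    {"id": 2, "name": "Bob; Jr.", "city": "Oslo"},
]

buf = io.StringIO()
writer = csv.DictWriter(
    buf,
    fieldnames=["id", "name", "city"],
    delimiter=";",              # semicolon delimiter
    quotechar='"',              # double-quote text qualifier
    quoting=csv.QUOTE_MINIMAL,  # quote only fields that need it
    lineterminator="\n",        # LF line endings (macOS/Linux)
)
writer.writeheader()            # the "Include Headers" option
for row in rows:
    # Custom null representation: write the string "NULL" for missing values
    writer.writerow({k: ("NULL" if v is None else v) for k, v in row.items()})

print(buf.getvalue())
```

Note that "Bob; Jr." gets quoted automatically because it contains the delimiter; this is exactly the situation the text qualifier setting exists for.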
JSON Specific Options
- Format: Pretty (human-readable) or Compact (minimized)
- Array Format: Array of objects or object with data property
- Include Metadata: Query execution info, column types, timestamps
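The pretty/compact and array-format choices map directly onto standard JSON serialization options. A small sketch (record keys are invented for the example):

```python
import json

# One exported row (keys are illustrative)
record = {"order_id": 42, "total": 19.99, "shipped": True, "notes": None}

pretty = json.dumps(record, indent=2)                # Pretty: human-readable
compact = json.dumps(record, separators=(",", ":"))  # Compact: no extra whitespace

rows = [record]
as_array = json.dumps(rows)                  # "array of objects" layout
with_data_key = json.dumps({"data": rows})   # "object with data property" layout

print(len(pretty), len(compact))  # pretty is larger than compact
```

Both forms parse back to the same data; only file size and readability differ. Note that None round-trips as JSON null, which is how JSON preserves missing values without a custom placeholder.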
Step 4: Execute and Verify
- Click "Export" to save the file
- Monitor progress for large exports
- Verify file size and content
- Open exported file to confirm formatting
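A quick way to verify a CSV export is to compare its data row count against the query's row count. This sketch parses an in-memory string standing in for the exported file (on a real file, pass `open("export.csv", newline="")` to the reader instead):

```python
import csv
import io

# Stand-in for the exported file's contents
exported = "id,name\n1,Anna\n2,Bob\n"

reader = csv.reader(io.StringIO(exported))
header = next(reader)               # first row should be the column names
data_rows = sum(1 for _ in reader)  # remaining rows are data
print(header, data_rows)
```

If the data row count doesn't match what the query returned, check WHERE-clause filters and NULL handling before trusting the file.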
Advanced Export Techniques
Exporting Specific Data Subsets
Column Selection
Export only specific columns by modifying your query:
-- Instead of SELECT *
SELECT customer_id, email, last_purchase_date
FROM customers
WHERE active = true;
Row Filtering
Use WHERE clauses to limit exported data:
-- Export recent data only
SELECT * FROM orders
WHERE order_date >= CURRENT_DATE - INTERVAL '30 days';
Aggregated Data
Export summarized results:
-- Export daily sales totals
SELECT
DATE(order_date) as day,
COUNT(*) as order_count,
SUM(total_amount) as daily_revenue
FROM orders
GROUP BY DATE(order_date)
ORDER BY day DESC;
Batch Exporting
For large datasets that exceed memory limits:
Chunked Exports
-- Export in batches of 10,000 rows
SELECT * FROM large_table
ORDER BY id
LIMIT 10000 OFFSET 0;
-- Then increment OFFSET for next batch
SELECT * FROM large_table
ORDER BY id
LIMIT 10000 OFFSET 10000;
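The LIMIT/OFFSET pattern above can be driven from a small loop. This sketch uses an in-memory SQLite table as a stand-in for `large_table` (table and column names are illustrative, and the batch size is shrunk for the demo):

```python
import sqlite3

# Build a small stand-in table
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE large_table (id INTEGER PRIMARY KEY, val TEXT)")
conn.executemany("INSERT INTO large_table (id, val) VALUES (?, ?)",
                 [(i, f"row-{i}") for i in range(1, 26)])

BATCH = 10  # 10,000 in a real export; small here for the demo
offset = 0
batches = []
while True:
    rows = conn.execute(
        "SELECT * FROM large_table ORDER BY id LIMIT ? OFFSET ?",
        (BATCH, offset),
    ).fetchall()
    if not rows:
        break
    batches.append(rows)   # in practice: write each batch to its own file
    offset += BATCH

print(f"{len(batches)} batches, {sum(len(b) for b in batches)} rows total")
```

One caveat: on very large tables, high OFFSET values get slow because the database still scans the skipped rows. Keyset pagination (`WHERE id > last_seen_id ORDER BY id LIMIT 10000`) avoids that cost.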
Server-Side Cursors
Enable server-side cursors in Preferences → Performance for streaming large result sets without memory issues.
Scheduled Exports (Advanced)
While HarborDB doesn't have built-in scheduling, you can:
- Save export queries as files
- Use macOS Automator to run exports
- Schedule with cron or launchd (advanced users)
- Export to monitored folders for automated processing
Format-Specific Guidance
CSV Export Best Practices
Data Preparation
- Clean special characters that might break CSV parsing:
-- Remove or escape commas in text fields
SELECT
id,
REPLACE(description, ',', ';') as description,
amount
FROM products;
- Handle line breaks in text fields:
-- Replace newlines with spaces
SELECT
id,
REPLACE(REPLACE(notes, CHR(10), ' '), CHR(13), ' ') as notes_clean
FROM customer_notes;
Import-Friendly CSV Settings
For maximum compatibility with other applications:
| Application     | Recommended Settings                                    |
| --------------- | ------------------------------------------------------- |
| Excel           | Comma delimiter, Double quote qualifier, UTF-8 encoding |
| Google Sheets   | Comma delimiter, Default encoding                       |
| Python/R Pandas | Comma delimiter, No text qualifier                      |
| PostgreSQL COPY | Comma delimiter, CSV header, Double quote qualifier     |
JSON Export Best Practices
Structured Data Export
-- Export hierarchical data
SELECT
order_id,
order_date,
JSON_BUILD_OBJECT(
'customer_id', c.customer_id,
'name', c.name,
'email', c.email
) as customer,
JSON_AGG(
JSON_BUILD_OBJECT(
'product_id', p.product_id,
'name', p.name,
'quantity', oi.quantity,
'price', oi.price
)
) as items
FROM orders o
JOIN customers c ON o.customer_id = c.customer_id
JOIN order_items oi ON o.order_id = oi.order_id
JOIN products p ON oi.product_id = p.product_id
GROUP BY o.order_id, o.order_date, c.customer_id, c.name, c.email;
JSON Schema Considerations
- Consistent structure: Ensure all rows have same keys
- Data types: JSON preserves strings, numbers, booleans, null
- Nesting depth: Consider flattening for some applications
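When a downstream consumer can't handle nesting, a common approach is to flatten each record into a single level using dotted keys. This is a generic sketch of the idea (not a built-in HarborDB feature), applied to a record shaped like the hierarchical export above:

```python
def flatten(obj, prefix=""):
    """Flatten nested dicts into one level with dotted keys."""
    flat = {}
    for key, value in obj.items():
        name = f"{prefix}{key}"
        if isinstance(value, dict):
            flat.update(flatten(value, name + "."))  # recurse into nested objects
        else:
            flat[name] = value
    return flat

row = {"order_id": 1, "customer": {"name": "Anna", "email": "a@example.com"}}
print(flatten(row))
```

The flattened form (`customer.name`, `customer.email`, ...) imports cleanly into spreadsheets and relational tables; nested arrays like the `items` list would still need a separate table or one row per item.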
Performance Optimization
Large Dataset Exports
Memory Management
- Enable streaming mode in Preferences → Performance
- Increase chunk size for faster exports (if sufficient RAM)
- Close other applications to free system resources
- Monitor Activity Monitor for memory usage
Network Considerations
For remote database exports:
- Export on server if possible (use SSH or remote desktop)
- Compress exports (HarborDB auto-compresses large JSON)
- Schedule off-peak hours for production databases
- Use WHERE clauses to limit data transfer
Export Speed Tips
- Select only needed columns (not SELECT *)
- Avoid expensive calculations in export queries
- Use indexed columns in WHERE clauses
- Export to SSD for faster disk I/O
- Disable antivirus real-time scanning on export folder (temporarily)
Security and Privacy
Sensitive Data Handling
Data Redaction
-- Export without sensitive information
SELECT
user_id,
LEFT(email, 3) || '***@***' as email_masked,
'***' as password_hash,
created_at
FROM users;
Compliance Considerations
- GDPR/CCPA: Remove personally identifiable information (PII)
- HIPAA: De-identify protected health information (PHI)
- PCI DSS: Never export full credit card numbers
- Internal policies: Follow your organization's data handling rules
Secure Export Practices
- Encrypt sensitive exports using macOS FileVault or third-party tools
- Use secure transfer methods for exported files (SFTP, encrypted email)
- Set appropriate file permissions (chmod 600 for sensitive files)
- Automatically delete temporary export files
- Audit export logs (available in Preferences → Logs)
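The permission and cleanup steps above can be scripted. This sketch writes an export to a temporary file restricted to the current user (the chmod 600 recommendation) and deletes it when done; the file contents and paths are illustrative:

```python
import os
import tempfile

# Create a temporary file for the export
fd, path = tempfile.mkstemp(suffix=".csv")
try:
    os.chmod(path, 0o600)             # owner read/write only (chmod 600)
    with os.fdopen(fd, "w") as f:
        f.write("id,email\n1,a@example.com\n")
    mode = oct(os.stat(path).st_mode & 0o777)
    print(mode)                       # the permission bits we just set
finally:
    os.remove(path)                   # delete the temporary export when done
```

The try/finally ensures the sensitive file is removed even if the write fails partway through.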
Troubleshooting Common Issues
CSV Problems
"Columns misaligned in Excel"
- Cause: Embedded commas or line breaks in data
- Solution: Use text qualifiers or clean data before export
"Character encoding issues"
- Cause: Non-UTF-8 characters in data
- Solution: Export with UTF-8 encoding; include a BOM so Excel detects it correctly
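Excel detects UTF-8 far more reliably when the file begins with a byte-order mark. If you post-process exports in Python, the `utf-8-sig` codec writes that BOM for you (file name and contents here are illustrative):

```python
import codecs
import os

data = "name,city\nJosé,Zürich\n"

# "utf-8-sig" prepends the UTF-8 byte-order mark (EF BB BF)
with open("export_bom.csv", "w", encoding="utf-8-sig") as f:
    f.write(data)

# Confirm the file starts with the BOM
with open("export_bom.csv", "rb") as f:
    head = f.read(3)
print(head == codecs.BOM_UTF8)  # True

os.remove("export_bom.csv")  # clean up the demo file
```

Reading the file back with `encoding="utf-8-sig"` strips the BOM again, so round-trips stay clean.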
"Large file import errors"
- Cause: Excel row/column limits (1,048,576 rows, 16,384 columns)
- Solution: Split data or use CSV tools without limits
JSON Problems
"Invalid JSON syntax"
- Cause: Special characters not properly escaped
- Solution: Use PostgreSQL JSON functions to sanitize data
"Memory errors on large exports"
- Cause: Attempting to load entire dataset into memory
- Solution: Enable streaming export or export in chunks
"Nested data too complex"
- Cause: Deeply nested JSON difficult to parse
- Solution: Flatten structure or simplify query
General Issues
"Export too slow"
- Possible causes: Network latency, disk speed, query performance
- Debug steps:
- Test with small dataset first
- Check network connectivity
- Verify disk has free space
- Optimize database query
"Missing data in export"
- Check: WHERE clause filters, column selection, NULL handling
- Verify: Row count matches expectations, all columns included
"Permission denied errors**
- Solution:
- Choose different save location
- Check folder permissions
- Run HarborDB with appropriate privileges
- Disable security software temporarily
Integration with Other Tools
Automating with Scripts
Shell Script Example
#!/bin/bash
# Export daily report and email it
EXPORT_FILE="/Users/$(whoami)/Exports/daily_sales_$(date +%Y%m%d).csv"
# You would need to script HarborDB actions here
# Consider using the PostgreSQL command-line tools for automation
psql -d your_database -c "\copy sales TO '$EXPORT_FILE' WITH (FORMAT csv, HEADER)"
# Compress and email
gzip "$EXPORT_FILE"
echo "Daily sales export attached" | mail -s "Daily Sales Report" \
-a "$EXPORT_FILE.gz" team@example.com
Importing to Other Applications
Excel/Google Sheets
- CSV: Direct import via Data → From Text/CSV
- JSON: Use Power Query (Excel) or Apps Script (Sheets)
Database Systems
-- PostgreSQL COPY command (from exported CSV)
COPY customers FROM '/path/to/export.csv' DELIMITER ',' CSV HEADER;
-- MySQL LOAD DATA
LOAD DATA LOCAL INFILE '/path/to/export.csv'
INTO TABLE customers
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS;
Programming Languages
# Python pandas example
import pandas as pd
# Read HarborDB CSV export
df = pd.read_csv('export.csv', encoding='utf-8')
print(f"Imported {len(df)} rows")
# Or JSON export
df_json = pd.read_json('export.json', orient='records')
Best Practices Summary
Before Exporting
- Test with small dataset first
- Verify query results match expectations
- Check available disk space
- Consider data sensitivity and compliance requirements
During Export
- Choose appropriate format for your use case
- Use descriptive file names with dates
- Include headers/metadata for clarity
- Monitor progress for large exports
After Exporting
- Verify file integrity (open and check sample)
- Secure sensitive files (encrypt if needed)
- Clean up temporary files
- Document export for reproducibility
Next Steps
Now that you can export data effectively:
- Query Basics - Master SQL query writing
- Database Navigation - Explore database structure
- Performance Optimization - Optimize export speed
- macOS Integration - Use native macOS features
Getting Help
If you encounter export issues:
- Check our Support Guide for troubleshooting
- Visit the FAQ for common questions
- Contact Support for personalized assistance