Exporting Data in HarborDB

Guides • Last updated: February 16, 2026

HarborDB provides powerful export capabilities to save your query results in various formats. Whether you need to share data with colleagues, import into other applications, or archive results for later analysis, this guide covers everything you need to know about exporting data efficiently and securely.

Export Formats Overview

HarborDB supports multiple export formats, each suited for different use cases:

CSV (Comma-Separated Values)

  • Best for: Spreadsheets (Excel, Google Sheets), data import/export, basic data exchange
  • Features: Custom delimiters, headers, text qualifiers, encoding options
  • Limitations: No support for hierarchical data, limited data type preservation

JSON (JavaScript Object Notation)

  • Best for: Web applications, APIs, NoSQL databases, data interchange
  • Features: Pretty printing, compact formatting, data type preservation
  • Limitations: Larger file size, not ideal for spreadsheets

Excel (Coming Soon)

  • Best for: Business reporting, formatted spreadsheets, complex data presentation
  • Features: Multiple sheets, formulas, formatting, charts
  • Status: Planned for future release

Basic Export Process

Step 1: Prepare Your Data

  1. Execute a query or select a table in the sidebar
  2. Review the results to ensure they contain the data you want to export
  3. Consider filtering if you only need specific rows or columns

Step 2: Initiate Export

There are several ways to start an export:

From Query Results

  • Click the Export button (šŸ“¤) in the results toolbar, or
  • Right-click anywhere in the results grid and select "Export Results" from the context menu

From Table Navigation

  1. Right-click a table in the sidebar
  2. Select "Export Data"
  3. Choose export scope:
    • Entire table: All data (use with caution on large tables)
    • First N rows: Limited sample (recommended for testing)
    • Custom query: Write specific SELECT statement

Keyboard Shortcuts

  • ⌘ + E: Quick export dialog
  • ⌘ + Shift + E: Export with last used settings

Step 3: Configure Export Settings

The export dialog provides configuration options:

General Settings

  • File Name: Auto-generated or custom name
  • Save Location: Choose folder destination
  • Include Headers: Column names as first row (CSV)
  • Encoding: UTF-8 (recommended), Latin-1, or system default

CSV Specific Options

  • Delimiter: Comma, semicolon, tab, pipe, or custom character
  • Text Qualifier: Quotes, double quotes, or none
  • Line Endings: macOS/Linux (LF), Windows (CRLF), or system default
  • Null Representation: Empty string, "NULL", or custom text
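These settings map directly onto what most CSV libraries expose. A sketch with Python's stdlib `csv` module showing how delimiter, text qualifier, headers, and NULL representation interact (illustrative, not HarborDB's implementation):

```python
import csv
import io

# A small result set; None stands in for SQL NULL.
rows = [
    {"id": 1, "note": "contains, comma", "score": None},
    {"id": 2, "note": 'has "quotes"', "score": 3.5},
]

buf = io.StringIO()
# Semicolon delimiter, double-quote qualifier, LF line endings.
writer = csv.writer(buf, delimiter=";", quotechar='"',
                    quoting=csv.QUOTE_MINIMAL, lineterminator="\n")
writer.writerow(["id", "note", "score"])  # Include Headers
for r in rows:
    # Null Representation: write the literal string "NULL".
    writer.writerow([r["id"], r["note"],
                     "NULL" if r["score"] is None else r["score"]])

print(buf.getvalue())
```

Note that with a semicolon delimiter the embedded comma needs no quoting at all, while the field containing quote characters is qualified and its quotes doubled.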

JSON Specific Options

  • Format: Pretty (human-readable) or Compact (minimized)
  • Array Format: Array of objects or object with data property
  • Include Metadata: Query execution info, column types, timestamps
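The same options correspond to standard JSON library features; for example, with Python's stdlib `json` module (a sketch, not HarborDB's implementation):

```python
import json

rows = [{"id": 1, "active": True, "note": None}]

# Pretty: indented, human-readable.
pretty = json.dumps(rows, indent=2)
# Compact: no whitespace, smallest file size.
compact = json.dumps(rows, separators=(",", ":"))

# "Object with data property" array format, with room for metadata.
wrapped = {"data": rows, "row_count": len(rows)}

print(compact)  # [{"id":1,"active":true,"note":null}]
```

Booleans and nulls survive the round trip as real types rather than strings, which is the main advantage JSON has over CSV here.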

Step 4: Execute and Verify

  1. Click "Export" to save the file
  2. Monitor progress for large exports
  3. Verify file size and content
  4. Open exported file to confirm formatting

Advanced Export Techniques

Exporting Specific Data Subsets

Column Selection

Export only specific columns by modifying your query:

-- Instead of SELECT *
SELECT customer_id, email, last_purchase_date
FROM customers
WHERE active = true;

Row Filtering

Use WHERE clauses to limit exported data:

-- Export recent data only
SELECT * FROM orders
WHERE order_date >= CURRENT_DATE - INTERVAL '30 days';

Aggregated Data

Export summarized results:

-- Export daily sales totals
SELECT
    DATE(order_date) as day,
    COUNT(*) as order_count,
    SUM(total_amount) as daily_revenue
FROM orders
GROUP BY DATE(order_date)
ORDER BY day DESC;

Batch Exporting

For large datasets that exceed memory limits:

Chunked Exports

-- Export in batches of 10,000 rows
SELECT * FROM large_table
ORDER BY id
LIMIT 10000 OFFSET 0;

-- Then increment OFFSET for next batch
SELECT * FROM large_table
ORDER BY id
LIMIT 10000 OFFSET 10000;
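OFFSET-based batches get progressively slower, because the database must scan and discard every skipped row. Keyset pagination (`WHERE id > last_seen_id`) keeps each batch equally cheap. A sketch of the loop using Python's stdlib `sqlite3` (table and column names are illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE large_table (id INTEGER PRIMARY KEY, val TEXT)")
conn.executemany("INSERT INTO large_table VALUES (?, ?)",
                 [(i, f"row{i}") for i in range(1, 25001)])

BATCH = 10_000
last_id = 0
batches = 0
while True:
    # Keyset pagination: seek past the last id instead of OFFSET-scanning.
    rows = conn.execute(
        "SELECT id, val FROM large_table WHERE id > ? ORDER BY id LIMIT ?",
        (last_id, BATCH),
    ).fetchall()
    if not rows:
        break
    last_id = rows[-1][0]
    batches += 1  # in a real export, write this chunk to the output file

print(batches)  # 25,000 rows -> 3 batches (10,000 + 10,000 + 5,000)
```

This works whenever the table has a unique, indexed ordering column; composite keys need a tuple comparison instead.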

Server-Side Cursors

Enable server-side cursors in Preferences → Performance for streaming large result sets without memory issues.

Scheduled Exports (Advanced)

While HarborDB doesn't have built-in scheduling, you can:

  1. Save export queries as files
  2. Use macOS Automator to run exports
  3. Schedule with cron or launchd (advanced users)
  4. Export to monitored folders for automated processing

Format-Specific Guidance

CSV Export Best Practices

Data Preparation

  1. Clean special characters that might break CSV parsing:
-- Remove or escape commas in text fields
SELECT
    id,
    REPLACE(description, ',', ';') as description,
    amount
FROM products;
  2. Handle line breaks in text fields:
-- Replace newlines with spaces
SELECT
    id,
    REPLACE(REPLACE(notes, CHR(10), ' '), CHR(13), ' ') as notes_clean
FROM customer_notes;
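Cleaning is strictly necessary only when exporting without a text qualifier; with quoting enabled, embedded commas and line breaks survive a CSV round trip intact. A quick check with Python's stdlib `csv` module:

```python
import csv
import io

# One row whose field contains both a newline and a comma.
original = [["1", "line one\nline two, with comma"]]

buf = io.StringIO()
csv.writer(buf, quoting=csv.QUOTE_ALL, lineterminator="\n").writerows(original)

# Reading it back recovers the embedded newline and comma unchanged.
restored = list(csv.reader(io.StringIO(buf.getvalue())))
print(restored == original)  # True
```

The REPLACE-based cleaning above is still the right call when the consumer can't parse quoted fields (some legacy loaders and naive split-on-comma scripts).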

Import-Friendly CSV Settings

For maximum compatibility with other applications:

| Application     | Recommended Settings                                    |
| --------------- | ------------------------------------------------------- |
| Excel           | Comma delimiter, Double quote qualifier, UTF-8 encoding |
| Google Sheets   | Comma delimiter, Default encoding                       |
| Python/R Pandas | Comma delimiter, No text qualifier                      |
| PostgreSQL COPY | Comma delimiter, CSV header, Quote double quote         |

JSON Export Best Practices

Structured Data Export

-- Export hierarchical data
SELECT
    o.order_id,
    o.order_date,
    JSON_BUILD_OBJECT(
        'customer_id', c.customer_id,
        'name', c.name,
        'email', c.email
    ) as customer,
    JSON_AGG(
        JSON_BUILD_OBJECT(
            'product_id', p.product_id,
            'name', p.name,
            'quantity', oi.quantity,
            'price', oi.price
        )
    ) as items
FROM orders o
JOIN customers c ON o.customer_id = c.customer_id
JOIN order_items oi ON o.order_id = oi.order_id
JOIN products p ON oi.product_id = p.product_id
GROUP BY o.order_id, o.order_date, c.customer_id, c.name, c.email;

JSON Schema Considerations

  • Consistent structure: Ensure all rows have same keys
  • Data types: JSON preserves strings, numbers, booleans, null
  • Nesting depth: Consider flattening for some applications
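When nesting is the problem, one common workaround is flattening nested objects into dotted keys before handing the file to a tabular tool. A generic Python sketch (not a built-in HarborDB option):

```python
def flatten(obj, prefix=""):
    """Flatten nested dicts into a single level using dotted keys."""
    flat = {}
    for key, value in obj.items():
        name = f"{prefix}{key}"
        if isinstance(value, dict):
            flat.update(flatten(value, name + "."))
        else:
            flat[name] = value
    return flat

row = {"order_id": 7, "customer": {"id": 42, "email": "a@example.com"}}
print(flatten(row))
# {'order_id': 7, 'customer.id': 42, 'customer.email': 'a@example.com'}
```

Arrays (like the `items` list in the query above) can't be flattened this way; those usually become one output row per array element instead.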

Performance Optimization

Large Dataset Exports

Memory Management

  1. Enable streaming mode in Preferences → Performance
  2. Increase chunk size for faster exports (if sufficient RAM)
  3. Close other applications to free system resources
  4. Monitor Activity Monitor for memory usage

Network Considerations

For remote database exports:

  1. Export on server if possible (use SSH or remote desktop)
  2. Compress exports (HarborDB auto-compresses large JSON)
  3. Schedule off-peak hours for production databases
  4. Use WHERE clauses to limit data transfer

Export Speed Tips

  1. Select only needed columns (not SELECT *)
  2. Avoid expensive calculations in export queries
  3. Use indexed columns in WHERE clauses
  4. Export to SSD for faster disk I/O
  5. Disable antivirus real-time scanning on export folder (temporarily)

Security and Privacy

Sensitive Data Handling

Data Redaction

-- Export without sensitive information
SELECT
    user_id,
    LEFT(email, 3) || '***@***' as email_masked,
    '***' as password_hash,
    created_at
FROM users;

Compliance Considerations

  1. GDPR/CCPA: Remove personal identifiable information (PII)
  2. HIPAA: De-identify protected health information (PHI)
  3. PCI DSS: Never export full credit card numbers
  4. Internal policies: Follow your organization's data handling rules

Secure Export Practices

  1. Encrypt sensitive exports using macOS FileVault or third-party tools
  2. Use secure transfer methods for exported files (SFTP, encrypted email)
  3. Set appropriate file permissions (chmod 600 for sensitive files)
  4. Automatically delete temporary export files
  5. Audit export logs (available in Preferences → Logs)
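Steps 3 and 4 can be scripted. A sketch with Python's standard library on macOS/Linux (the temporary path is illustrative):

```python
import os
import stat
import tempfile

# Create an export file and restrict it to owner read/write (chmod 600).
fd, path = tempfile.mkstemp(suffix=".csv")
os.close(fd)
os.chmod(path, stat.S_IRUSR | stat.S_IWUSR)

mode = stat.S_IMODE(os.stat(path).st_mode)
print(oct(mode))  # 0o600: readable and writable by the owner only

os.remove(path)  # delete the temporary export once it has been consumed
```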

Troubleshooting Common Issues

CSV Problems

"Columns misaligned in Excel"

  • Cause: Embedded commas or line breaks in data
  • Solution: Use text qualifiers or clean data before export

"Character encoding issues"

  • Cause: Non-UTF-8 characters in data
  • Solution: Export with UTF-8 encoding with BOM for Excel
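Excel detects UTF-8 reliably only when the file begins with a byte-order mark. In Python the stdlib `utf-8-sig` codec writes it automatically (a generic sketch, not HarborDB-specific):

```python
import csv
import os
import tempfile

path = os.path.join(tempfile.gettempdir(), "export_bom.csv")

# 'utf-8-sig' prepends the UTF-8 BOM so Excel opens accented text correctly.
with open(path, "w", encoding="utf-8-sig", newline="") as f:
    csv.writer(f).writerow(["café", "über"])

with open(path, "rb") as f:
    starts_with_bom = f.read(3) == b"\xef\xbb\xbf"
print(starts_with_bom)  # True

os.remove(path)
```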

"Large file import errors"

  • Cause: Excel row/column limits (1,048,576 rows, 16,384 columns)
  • Solution: Split data or use CSV tools without limits
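Splitting an oversized CSV while repeating the header in each part takes only a few lines. A sketch using Python's stdlib `csv` module (the function name is illustrative):

```python
import csv
import io

def split_csv(text, rows_per_file):
    """Split CSV text into parts, repeating the header row in each part."""
    rows = list(csv.reader(io.StringIO(text)))
    header, data = rows[0], rows[1:]
    parts = []
    for start in range(0, len(data), rows_per_file):
        buf = io.StringIO()
        writer = csv.writer(buf, lineterminator="\n")
        writer.writerow(header)                          # repeat the header
        writer.writerows(data[start:start + rows_per_file])
        parts.append(buf.getvalue())
    return parts

text = "id,name\n" + "".join(f"{i},row{i}\n" for i in range(1, 8))
parts = split_csv(text, 3)
print(len(parts))  # 7 data rows in batches of 3 -> 3 parts (3 + 3 + 1)
```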

JSON Problems

"Invalid JSON syntax"

  • Cause: Special characters not properly escaped
  • Solution: Use PostgreSQL JSON functions to sanitize data

"Memory errors on large exports"

  • Cause: Attempting to load entire dataset into memory
  • Solution: Enable streaming export or export in chunks

"Nested data too complex"

  • Cause: Deeply nested JSON difficult to parse
  • Solution: Flatten structure or simplify query

General Issues

"Export too slow"

  • Possible causes: Network latency, disk speed, query performance
  • Debug steps:
    1. Test with small dataset first
    2. Check network connectivity
    3. Verify disk has free space
    4. Optimize database query

"Missing data in export"

  • Check: WHERE clause filters, column selection, NULL handling
  • Verify: Row count matches expectations, all columns included

"Permission denied errors"

  • Solution:
    1. Choose different save location
    2. Check folder permissions
    3. Run HarborDB with appropriate privileges
    4. Disable security software temporarily

Integration with Other Tools

Automating with Scripts

Shell Script Example

#!/bin/bash
# Export daily sales to CSV and email it
EXPORT_FILE="/Users/$(whoami)/Exports/daily_sales_$(date +%Y%m%d).csv"

# HarborDB itself has no command-line interface, so use the PostgreSQL
# client tools for automation. \copy writes real CSV; pg_dump would emit
# SQL statements, not CSV.
psql your_database -c "\copy sales TO '$EXPORT_FILE' CSV HEADER"

# Compress and email (mailx attachment syntax; adjust for your mail client)
gzip "$EXPORT_FILE"
echo "Daily sales export attached" | mail -s "Daily Sales Report" \
  -a "$EXPORT_FILE.gz" team@example.com

Importing to Other Applications

Excel/Google Sheets

  1. CSV: Direct import via Data → From Text/CSV
  2. JSON: Use Power Query (Excel) or Apps Script (Sheets)

Database Systems

-- PostgreSQL COPY command (from exported CSV)
COPY customers FROM '/path/to/export.csv' DELIMITER ',' CSV HEADER;

-- MySQL LOAD DATA
LOAD DATA LOCAL INFILE '/path/to/export.csv'
INTO TABLE customers
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS;

Programming Languages

# Python pandas example
import pandas as pd

# Read HarborDB CSV export
df = pd.read_csv('export.csv', encoding='utf-8')
print(f"Imported {len(df)} rows")

# Or JSON export
df_json = pd.read_json('export.json', orient='records')

Best Practices Summary

Before Exporting

  1. Test with small dataset first
  2. Verify query results match expectations
  3. Check available disk space
  4. Consider data sensitivity and compliance requirements

During Export

  1. Choose appropriate format for your use case
  2. Use descriptive file names with dates
  3. Include headers/metadata for clarity
  4. Monitor progress for large exports

After Exporting

  1. Verify file integrity (open and check sample)
  2. Secure sensitive files (encrypt if needed)
  3. Clean up temporary files
  4. Document export for reproducibility

Next Steps

Now that you can export data effectively:

  1. Query Basics - Master SQL query writing
  2. Database Navigation - Explore database structure
  3. Performance Optimization - Optimize export speed
  4. macOS Integration - Use native macOS features

Getting Help

If you encounter export issues:

  1. Check our Support Guide for troubleshooting
  2. Visit the FAQ for common questions
  3. Contact Support for personalized assistance

Was this helpful?

Help us improve this documentation by providing feedback.