Class ExportConfig

Namespace
YndigoBlue.Velocity.Model
Assembly
YndigoBlue.Velocity.dll

Configuration for exporting query results to CSV files.

public class ExportConfig
Inheritance
ExportConfig

Remarks

ExportConfig controls how database query results are formatted and written to CSV files when using, for example, ExportData(Table, string, ExportConfig). It handles:

  • CSV format settings (delimiters, quote characters, line terminators)
  • Date/time output formats
  • Header row generation
  • Field quoting behavior
  • Character trimming

ExportConfig provides two initialization modes:

  • ISO Format (default): Uses standard ISO date formats and comma delimiters
  • Culture Format: Uses current culture's date formats and list separators

The configuration ensures exported CSV files are properly formatted and can be easily imported into spreadsheet applications, other databases, or re-imported using ImportConfig.

Examples

Basic export with default settings:

using (Manager manager = new Manager(connection))
{
    manager.LoadSchema("my_schema");
    Schema schema = manager.GetSchema("my_schema");
    Table usersTable = schema["users"];

    // Create query
    Query query = new Query();
    query.AddFromItem(usersTable);
    query.AddSelectItem(usersTable["id"]);
    query.AddSelectItem(usersTable["username"]);
    query.AddSelectItem(usersTable["email"]);
    query.AddSelectItem(usersTable["created_at"]);

    // Create export config with default ISO format
    ExportConfig config = new ExportConfig();

    // Export to CSV
    string csvFilePath = @"C:\exports\users.csv";
    int rowsExported = manager.Export(config, query, csvFilePath);

    Console.WriteLine($"Exported {rowsExported} rows");
}

// Output CSV format:
// id,username,email,created_at
// 1,john_doe,john@example.com,2025-01-07 10:30:00
// 2,jane_smith,jane@example.com,2025-01-07 11:45:00

Export with custom CSV format:

// Create config for tab-delimited output
ExportConfig config = new ExportConfig(DateFormatType.Iso);
config.FieldDelimiter = '\t';  // Tab-separated values
config.QuoteCharacter = '\'';  // Single quotes
config.ForceQualifier = true;  // Always quote fields
config.WriteHeaders = true;

int rowsExported = manager.Export(config, query, csvFilePath);

// Output format:
// 'id'\t'username'\t'email'
// '1'\t'john_doe'\t'john@example.com'

Export with custom date formats:

// Configure for European date format
ExportConfig config = new ExportConfig(DateFormatType.Culture);
config.DateFormat = "dd/MM/yyyy";
config.DateTimeFormat = "dd/MM/yyyy HH:mm:ss";
config.TimestampFormat = "dd/MM/yyyy HH:mm:ss zzz";

int rowsExported = manager.Export(config, query, csvFilePath);

// Output format:
// id,username,created_at
// 1,john_doe,07/01/2025 10:30:00
// 2,jane_smith,07/01/2025 11:45:00

Export without headers:

ExportConfig config = new ExportConfig();
config.WriteHeaders = false;  // Don't write header row

int rowsExported = manager.Export(config, query, csvFilePath);

// Output format (no headers):
// 1,john_doe,john@example.com,2025-01-07 10:30:00
// 2,jane_smith,jane@example.com,2025-01-07 11:45:00

Export with forced field quoting:

ExportConfig config = new ExportConfig();
config.ForceQualifier = true;  // Quote all fields
config.QuoteCharacter = '"';

int rowsExported = manager.Export(config, query, csvFilePath);

// Output format:
// "id","username","email","created_at"
// "1","john_doe","john@example.com","2025-01-07 10:30:00"
// "2","jane_smith","jane@example.com","2025-01-07 11:45:00"

Export pipe-delimited file:

ExportConfig config = new ExportConfig();
config.FieldDelimiter = '|';
config.QuoteCharacter = '"';
config.LineTerminator = "\r\n";  // Windows line endings

int rowsExported = manager.Export(config, query, csvFilePath);

// Output format:
// id|username|email
// 1|john_doe|john@example.com
// 2|jane_smith|jane@example.com

Export with character trimming:

ExportConfig config = new ExportConfig();
config.TrimChars = true;  // Trim leading/trailing whitespace (default)

// Or disable trimming to preserve all whitespace
config.TrimChars = false;

int rowsExported = manager.Export(config, query, csvFilePath);

Export using current culture format:

// Use system's regional settings for dates and list separator
ExportConfig config = new ExportConfig(DateFormatType.Culture);

// This will use:
// - Current culture's date format
// - Current culture's list separator (comma, semicolon, etc.)
// - Current culture's datetime patterns

int rowsExported = manager.Export(config, query, csvFilePath);

Export with custom escape character:

ExportConfig config = new ExportConfig();
config.EscapeCharacter = "\\";  // Backslash escape character
config.QuoteCharacter = '"';

// Fields with quotes will be escaped:
// "Product with \"quotes\" in name"

int rowsExported = manager.Export(config, query, csvFilePath);

Complete export configuration example:

ExportConfig config = new ExportConfig(DateFormatType.Iso);

// CSV format settings
config.FieldDelimiter = ',';
config.QuoteCharacter = '"';
config.EscapeCharacter = "\\";
config.LineTerminator = Environment.NewLine;
config.ForceQualifier = false;  // Only quote when necessary

// Output settings
config.WriteHeaders = true;
config.TrimChars = true;

// Date/time formats
config.DateFormat = "yyyy-MM-dd";
config.DateTimeFormat = "yyyy-MM-dd HH:mm:ss";
config.TimestampFormat = "yyyy-MM-dd HH:mm:ss zzz";

int rowsExported = manager.Export(config, query, csvFilePath);

Constructors

ExportConfig()
ExportConfig(DateFormatType)

Properties

BlobColumns

Global per-column blob handling configuration applied to any table. Key is the database column name. Prefer SetBlobColumnBase64(string, string) or SetBlobColumnFile(string, string, string) when exporting schemas or databases that contain tables with identically-named blob columns but different handling requirements.
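As a sketch of the global registration flow — the parameter orders below are inferred from the method summaries on this page rather than verified against the assembly, and the column names and the {id} pattern token are illustrative:

```csharp
ExportConfig config = new ExportConfig();

// Any table's "avatar" column: emit the blob as Base64 text in the cell.
config.SetBlobColumnBase64("avatar");

// Any table's "invoice_pdf" column: write the blob to an external file;
// the {id} token is substituted with the row's id column value.
config.SetBlobColumnFile("invoice_pdf", "invoices/{id}.pdf");

int rowsExported = manager.Export(config, query, csvFilePath);
```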

DateFormat
DateTimeFormat
EscapeCharacter
FieldDelimiter
ForceQualifier
LineTerminator
QuoteCharacter
TimestampFormat
TrimChars
WriteHeaders

Methods

GetBlobConfig(string, string, string)

Resolves the blob handling configuration for a column using a three-level fallback: schema+table (most specific) → table → global. Returns null when no config has been registered for the column at any level.
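A possible resolution sequence, assuming GetBlobConfig takes (schema, table, column) in that order; the schema/table/column names and the var-typed results are illustrative:

```csharp
config.SetBlobColumnBase64("photo");            // global scope
config.SetBlobColumnBase64("users", "photo");   // table scope

var a = config.GetBlobConfig("public", "users", "photo");
// resolves to the table-scoped registration for "users"

var b = config.GetBlobConfig("public", "orders", "photo");
// no table- or schema-scoped entry, so falls back to the global one

var c = config.GetBlobConfig("public", "orders", "notes");
// null: nothing registered for this column at any level
```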

HasAnyBlobConfig()

Returns true if any blob column configuration has been registered at any scope (global, table, or schema+table).
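For example, a minimal sketch:

```csharp
ExportConfig config = new ExportConfig();
bool any = config.HasAnyBlobConfig();   // false: nothing registered yet

config.SetBlobColumnBase64("avatar");   // global registration
any = config.HasAnyBlobConfig();        // true
```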

SetBlobColumnBase64(string)

Configures a blob column to be exported as a Base64-encoded string in the CSV cell. Applies globally to any table that has a column with this name.

SetBlobColumnBase64(string, string)

Configures a blob column to be exported as a Base64-encoded string in the CSV cell, scoped to a specific table. Takes precedence over the global overload when the same column name exists in multiple tables with different handling requirements.

SetBlobColumnBase64(string, string, string)

Configures a blob column to be exported as a Base64-encoded string in the CSV cell, scoped to a specific schema and table. Takes precedence over both the table-scoped and global overloads — use this when exporting a database where multiple schemas contain tables with the same name but different blob handling requirements.
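The three overloads together, as a sketch of scope precedence — parameter orders are assumed to be (column), (table, column), and (schema, table, column), and the names are illustrative:

```csharp
ExportConfig config = new ExportConfig();

config.SetBlobColumnBase64("data");                         // any table
config.SetBlobColumnBase64("documents", "data");            // only documents.data
config.SetBlobColumnBase64("archive", "documents", "data"); // only archive.documents.data
```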

SetBlobColumnFile(string, string)

Configures a blob column to be exported as an external file whose path is derived from filePathPattern. Applies globally to any table that has a column with this name. Use {column_name} tokens to reference other column values. Files are written relative to the CSV file's directory, and the resolved path is also written into the CSV cell. Example: attachments/{category}/…
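A sketch of a pattern-based registration; the column names and the .bin extension are illustrative:

```csharp
// {category} and {id} are substituted with the row's column values;
// the file lands relative to the CSV's directory, and the resolved
// path is written into the "attachment" cell.
config.SetBlobColumnFile("attachment", "attachments/{category}/{id}.bin");
```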

SetBlobColumnFile(string, string, string)

Configures a blob column to be exported as an external file, scoped to a specific table. Takes precedence over the global overload when the same column name exists in multiple tables with different handling requirements.

SetBlobColumnFile(string, string, string, string)

Configures a blob column to be exported as an external file, scoped to a specific schema and table. Takes precedence over both the table-scoped and global overloads — use this when exporting a database where multiple schemas contain tables with the same name but different blob handling requirements.