DBLense Data Studio — User Guide

The all-in-one data management toolkit: parallel backup & restore, streaming server-to-server migration, cross-RDBMS transfers between SQL Server, MySQL, PostgreSQL, and Oracle, plus multi-format export/import to CSV, JSON, XML, SQL, Excel, TXT, and HTML. Single portable .exe — no installation required.

Getting Started

Double-click DBLenseDataStudio.exe — no installation, no prerequisites. The application opens with four tabs representing its four operating modes:

| Tab | Purpose |
|---|---|
| ⬆ BACKUP | Parallel multi-threaded database export with snapshot consistency and GZip compression |
| ⬇ RESTORE | Parallel multi-threaded database restore with drop/recreate, identity insert, and constraint management |
| ⇄ MIGRATE | Streaming producer-consumer pipeline for server-to-server migration (export and import run concurrently) |
| 📦 EXPORT / IMPORT | Cross-RDBMS data transfer and multi-format file export/import with automatic column mapping |

Connecting to a Server

Click the Connect button on any tab to open the connection dialog. Data Studio supports four authentication modes:

| Mode | Description | Use Case |
|---|---|---|
| Windows Authentication | Uses your current Windows domain credentials | On-premises Active Directory environments |
| SQL Server Authentication | Username and password stored in SQL Server | Mixed-mode servers, cross-domain access |
| Azure Entra ID (Password) | Azure AD username + password | Azure SQL MI with non-MFA accounts |
| Azure Entra ID (MFA) | Interactive Azure AD with multi-factor | Azure SQL MI with MFA enabled |

Connection settings are remembered between sessions. Passwords are encrypted using DPAPI (Windows Data Protection).

⬆ Backup (Parallel Export)

The Backup tab exports database data using a multi-threaded parallel engine. Each table is exported on its own thread for massive throughput gains over traditional single-threaded tools.
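The per-table fan-out can be sketched with a thread pool. This is an illustration only, not the tool's actual code; `export_table` and the table names are hypothetical stand-ins for the real per-table export (BCP out, optional GZip):

```python
from concurrent.futures import ThreadPoolExecutor

def export_table(table: str) -> tuple[str, int]:
    # Stand-in for the real per-table export work.
    # Here every table just "exports" 1000 rows.
    return table, 1000

tables = ["dbo.Orders", "dbo.Customers", "dbo.Products"]

# Each table runs on its own worker thread, up to max_workers at a time,
# which is where the throughput gain over single-threaded tools comes from.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = dict(pool.map(export_table, tables))

print(results)
```

Because the per-table exports are independent, adding threads scales throughput until disk or network I/O saturates — hence the configurable thread cap below.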

Quick Start

  1. Connect to your SQL Server instance
  2. Select one or more databases (or use Select All)
  3. Choose an output folder
  4. Configure options (threads, compression, snapshots)
  5. Click Start Export

Configuration Options

| Option | Description | Default |
|---|---|---|
| Output Folder | Directory where exported .bcp files are saved | Desktop\DBLenseExport |
| Max Threads | Number of parallel export threads (2–16) | 4 |
| Format | Output format: native BCP or SQL scripts | Native |
| Compression | GZip compress output files (.gz) | Off |
| Include Schema | Export DDL (CREATE TABLE) alongside data | On |
| Database Snapshot | Create a snapshot for point-in-time consistent export | On |

Database Selection

After connecting, all databases are listed with checkboxes, and the list can be filtered. You can also expand a database and select individual tables for a targeted backup.


Snapshot Consistency

When the Database Snapshot option is enabled, Data Studio creates a temporary database snapshot before exporting, so every table is read from the same point-in-time view of the database even while other sessions continue writing. The snapshot is dropped when the export finishes.

Database snapshots require SQL Server Enterprise or Developer Edition on-premises. Azure SQL supports snapshots natively.

Compression

Enable GZip compression to reduce output file sizes by 60–80%. This is especially useful for large databases and for uploading backups to cloud storage.

Compressed files have a .gz extension and can be restored by Data Studio or any standard GZip tool.
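The compression ratio is easy to reproduce with standard GZip; a small Python sketch with made-up, repetitive row data (exported table data is typically repetitive, which is why GZip does so well on it):

```python
import gzip

# Made-up export rows; the repetition mimics real exported table data.
data = b"2024-01-01,42,Widget,NULL\n" * 5000

compressed = gzip.compress(data)
restored = gzip.decompress(compressed)

assert restored == data  # lossless round trip, as with any GZip tool
print(f"compressed to {len(compressed) / len(data):.1%} of original")
```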


⬇ Restore (Parallel Import)

The Restore tab imports previously exported data using multi-threaded parallel processing — mirroring the speed gains of the Backup tab.

Quick Start

  1. Connect to the target SQL Server
  2. Browse to the export folder (Data Studio auto-detects databases and tables from the folder structure)
  3. Select what to restore
  4. Configure restore options
  5. Click Start Restore

Restore Options

| Option | Description |
|---|---|
| Truncate before restore | Clear existing data before importing each table |
| Drop & recreate tables | Re-create table structure from exported schema DDL |
| Identity insert | Preserve original identity column values (SET IDENTITY_INSERT ON) |
| Disable constraints | Temporarily disable foreign keys and check constraints during restore |
| Max Threads | Number of parallel restore threads (2–16) |
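The options above imply a fixed ordering of T-SQL statements around each table's bulk insert. A sketch of that ordering — the statement text is illustrative, not the tool's exact output:

```python
def restore_plan(table: str, truncate: bool, identity_insert: bool,
                 disable_constraints: bool) -> list[str]:
    """Return the per-table statements implied by the options, in order."""
    plan = []
    if disable_constraints:
        plan.append(f"ALTER TABLE {table} NOCHECK CONSTRAINT ALL")
    if truncate:
        plan.append(f"TRUNCATE TABLE {table}")
    if identity_insert:
        plan.append(f"SET IDENTITY_INSERT {table} ON")
    plan.append(f"-- bulk insert rows into {table}")
    if identity_insert:
        plan.append(f"SET IDENTITY_INSERT {table} OFF")
    if disable_constraints:
        plan.append(f"ALTER TABLE {table} WITH CHECK CHECK CONSTRAINT ALL")
    return plan

for stmt in restore_plan("dbo.Orders", truncate=True,
                         identity_insert=True, disable_constraints=True):
    print(stmt)
```

Note the symmetry: anything disabled or switched on before the insert is re-enabled or switched off afterwards, and re-enabling with WITH CHECK re-validates the constraints against the restored data.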

⇄ Migrate (Streaming Pipeline)

The Migrate tab provides server-to-server migration using a streaming producer-consumer pipeline. Unlike traditional backup-then-restore workflows, tables begin importing on the target as soon as they finish exporting from the source.

How It Works

  1. Connect to both source and target SQL Server instances
  2. Select source databases to migrate
  3. Optionally override the target database name
  4. Configure thread count and options
  5. Click Start Migration

Internally, Data Studio uses Channel<T> (a .NET bounded channel) to coordinate producers (exporters) and consumers (importers). Because the channel is bounded, importing starts as soon as the first table finishes exporting, and the amount of data buffered in flight stays capped.
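The same producer-consumer pattern can be sketched in Python, with a bounded `queue.Queue` standing in for Channel<T> (the table names are made up):

```python
import queue
import threading

channel: "queue.Queue[str | None]" = queue.Queue(maxsize=2)  # bounded, like Channel<T>
imported: list[str] = []

def exporter(tables: list[str]) -> None:
    # Producer: hands each table off as soon as its export finishes.
    for table in tables:
        channel.put(table)   # blocks while the channel is full (backpressure)
    channel.put(None)        # sentinel: no more tables

def importer() -> None:
    # Consumer: starts importing while later tables are still exporting.
    while (table := channel.get()) is not None:
        imported.append(table)

producer = threading.Thread(target=exporter, args=(["Orders", "Customers", "Products"],))
consumer = threading.Thread(target=importer)
producer.start()
consumer.start()
producer.join()
consumer.join()

print(imported)
```

The bounded capacity is the key design choice: a fast exporter cannot race arbitrarily far ahead of a slow importer, so memory stays flat regardless of database size.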

Migration Options

| Option | Description | Default |
|---|---|---|
| Source Server | The SQL Server instance to export from | (required) |
| Target Server | The SQL Server instance to import into | (required) |
| Target Database Override | Rename the database on the target (e.g., Production → Staging) | Same as source |
| Max Threads | Parallel threads for both export and import | 4 |
| Use Snapshots | Create a snapshot on the source for consistency | On |
| Drop & Recreate | Drop and recreate tables on the target | Off |

📦 Export / Import (Cross-RDBMS & File Formats)

The Export/Import tab is the most versatile mode. It supports three transfer directions:

| Direction | Description |
|---|---|
| Database → Database | Transfer tables between any supported database platforms (e.g., MySQL → SQL Server) |
| Database → File | Export tables to CSV, JSON, XML, SQL, XLSX, TXT, or HTML files |
| File → Database | Import data from CSV, JSON, XML, SQL, or XLSX files into any supported database |

Quick Start — Database to Database

  1. Select direction: Database → Database
  2. Configure Source: choose provider (SQL Server, MySQL, PostgreSQL, or Oracle), enter connection details, click Test Connection
  3. Configure Target: choose provider and connection details
  4. Click Discover Tables to load the source table list
  5. Select tables to transfer and review column mappings
  6. Click Start Transfer

Quick Start — Database to File

  1. Select direction: Database → File
  2. Configure source database connection
  3. Choose output folder and file format
  4. Configure format-specific options (delimiter, encoding, etc.)
  5. Select tables to export
  6. Click Start Transfer

Quick Start — File to Database

  1. Select direction: File → Database
  2. Browse to the source file (CSV, JSON, XML, SQL, or XLSX)
  3. Configure target database connection
  4. Data Studio auto-detects columns and types from the file
  5. Review column mappings and target table name
  6. Click Start Transfer

Supported Database Platforms

Data Studio can transfer data between any combination of these four platforms:

| Platform | Versions | Driver | Bulk Insert Method |
|---|---|---|---|
| SQL Server | 2016+ / Azure MI | Microsoft.Data.SqlClient | SqlBulkCopy (maximum throughput) |
| MySQL | 5.7+ / MariaDB 10+ | MySqlConnector | Batched INSERT statements (500 rows/batch) |
| PostgreSQL | 12+ | Npgsql | COPY BINARY (blazing fast imports) |
| Oracle | 12c+ | Oracle.ManagedDataAccess.Core | INSERT ALL batched writes |

When transferring between different platforms, Data Studio automatically maps data types. For example: INT ↔ NUMBER, VARCHAR ↔ TEXT, DATETIME ↔ TIMESTAMP, BOOLEAN ↔ BIT, SERIAL ↔ IDENTITY, and hundreds more.

File Formats

Data Studio supports 7 file formats for export and 5 for import:

CSV (Comma-Separated Values)

| Option | Description | Default |
|---|---|---|
| Delimiter | Field separator character | , (comma) |
| Text Qualifier | Character to wrap string values | " (double quote) |
| Include Header | Write column names as the first row | On |
| Encoding | File encoding (UTF-8, UTF-16, ASCII) | UTF-8 |
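How the delimiter and text-qualifier options interact can be sketched with Python's csv module — the data is made up, and the sketch is not the tool's implementation:

```python
import csv
import io

rows = [("id", "name"), (1, 'Widget "A"'), (2, "Widget B")]

buf = io.StringIO()
# delimiter and quotechar mirror the Delimiter / Text Qualifier options;
# QUOTE_MINIMAL applies the qualifier only to values that need it.
writer = csv.writer(buf, delimiter=",", quotechar='"', quoting=csv.QUOTE_MINIMAL)
writer.writerows(rows)

print(buf.getvalue())
```

Note that an embedded qualifier is escaped by doubling it (`Widget "A"` becomes `"Widget ""A"""`), which is the standard CSV convention any compliant reader understands.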

TXT (Tab-Delimited)

Same options as CSV but defaults to tab (\t) as delimiter. Ideal for pasting into spreadsheets.

JSON

| Option | Description | Default |
|---|---|---|
| Pretty Print | Indented, human-readable output | On |
| Wrapper | Array ([ ]) or object with metadata | Array |

XML

| Option | Description | Default |
|---|---|---|
| Root Element | Name of the root XML element | DataSet |
| Row Element | Name of each row element | Row |
| Include Schema | Embed XSD schema in the output | Off |

Excel (XLSX)

Exports styled Excel workbooks with a header row of column names.

Import reads .xlsx files, using the first row as column headers, with automatic type detection from cell values.

SQL Script

| Option | Description | Default |
|---|---|---|
| Include CREATE TABLE | Generate CREATE TABLE DDL before INSERT statements | Off |
| Include DROP TABLE | Generate DROP TABLE IF EXISTS before CREATE | Off |
| Batch Size | Number of INSERT rows per GO batch | 1000 |
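The Batch Size option can be pictured as follows — a sketch where the generated statement text and helper function are illustrative, not the tool's actual output:

```python
def insert_batches(table: str, rows: list[tuple], batch_size: int = 1000) -> str:
    """Emit INSERT statements with a GO separator after each batch."""
    lines = []
    for i, row in enumerate(rows, start=1):
        values = ", ".join(repr(v) for v in row)
        lines.append(f"INSERT INTO {table} VALUES ({values});")
        if i % batch_size == 0:
            lines.append("GO")
    if rows and lines[-1] != "GO":
        lines.append("GO")  # close the final partial batch
    return "\n".join(lines)

script = insert_batches("dbo.Orders", [(i, f"item{i}") for i in range(2500)])
assert script.count("GO") == 3   # rows 1-1000, 1001-2000, 2001-2500
```

Batching matters because each GO boundary lets the client submit and the server commit work in chunks, instead of parsing one enormous statement list.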

HTML

Generates a styled HTML document with a dark-themed table, ready for reports, presentations, or embedding in web pages. Export-only (not importable).


Automatic Column Mapping

When transferring between databases, Data Studio automatically:

  1. Discovers source columns — reads column names, data types, nullability, and constraints from the source
  2. Maps to target types — converts source types to equivalent target types (e.g., SQL Server NVARCHAR(MAX) → PostgreSQL TEXT)
  3. Generates target DDL — creates the target table if it doesn’t exist, using the mapped types

You can review and override any mapping in the column mapping grid before starting the transfer.

Common Type Mappings

| SQL Server | MySQL | PostgreSQL | Oracle |
|---|---|---|---|
| INT | INT | INTEGER | NUMBER(10) |
| BIGINT | BIGINT | BIGINT | NUMBER(19) |
| VARCHAR(n) | VARCHAR(n) | VARCHAR(n) | VARCHAR2(n) |
| NVARCHAR(MAX) | LONGTEXT | TEXT | CLOB |
| DATETIME2 | DATETIME(6) | TIMESTAMP | TIMESTAMP |
| BIT | TINYINT(1) | BOOLEAN | NUMBER(1) |
| DECIMAL(p,s) | DECIMAL(p,s) | NUMERIC(p,s) | NUMBER(p,s) |
| VARBINARY(MAX) | LONGBLOB | BYTEA | BLOB |
| UNIQUEIDENTIFIER | CHAR(36) | UUID | RAW(16) |
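In code, such a mapping is essentially a lookup table with a pass-through fallback. A sketch using a few entries from the table above — the data structure is hypothetical, not the tool's internal format:

```python
# A few entries from the mapping table, keyed by (source, target) platform.
TYPE_MAP = {
    ("sqlserver", "postgresql"): {
        "INT": "INTEGER", "NVARCHAR(MAX)": "TEXT",
        "DATETIME2": "TIMESTAMP", "BIT": "BOOLEAN",
    },
    ("sqlserver", "oracle"): {
        "INT": "NUMBER(10)", "NVARCHAR(MAX)": "CLOB",
        "DATETIME2": "TIMESTAMP", "BIT": "NUMBER(1)",
    },
}

def map_type(source: str, target: str, sql_type: str) -> str:
    # Fall back to the source type when no explicit mapping exists.
    return TYPE_MAP.get((source, target), {}).get(sql_type, sql_type)

print(map_type("sqlserver", "postgresql", "BIT"))  # BOOLEAN
```

The pass-through fallback reflects the common case where a type name (e.g., TIMESTAMP) is valid on both platforms and needs no translation.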

Error Handling

Data Studio provides three error handling modes for transfers:

| Mode | Behavior |
|---|---|
| Prompt | Pause on each error and ask whether to skip or abort |
| Ignore All | Log the error and continue with remaining tables |
| Abort on First | Stop the entire transfer immediately on the first error |

All errors are captured in the results summary with table name, error message, and row count at time of failure. You can optionally save an error log file for later analysis.
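The modes differ only in what happens after a per-table failure. A sketch, with hypothetical helpers (the interactive Prompt mode is omitted since it requires user input):

```python
def run_transfer(tables, transfer, mode="ignore"):
    """Transfer each table, applying an error-handling mode.

    mode="ignore" logs failures and continues with the remaining tables;
    mode="abort" stops at the first failure.
    """
    errors = []
    for table in tables:
        try:
            transfer(table)
        except Exception as exc:
            errors.append((table, str(exc)))  # captured for the results summary
            if mode == "abort":
                break
    return errors

def flaky(table):
    # Hypothetical per-table transfer that fails for one table.
    if table == "bad":
        raise RuntimeError("constraint violation")

assert run_transfer(["a", "bad", "c"], flaky, mode="ignore") == [("bad", "constraint violation")]
assert run_transfer(["bad", "c"], flaky, mode="abort") == [("bad", "constraint violation")]
```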


Azure & Cloud Integration

Azure SQL Managed Instance

Data Studio fully supports Azure SQL Managed Instance, with automatic detection of managed-instance connections.

Azure Blob Storage

Upload and download backups directly to/from Azure Blob Storage:

  1. Enter your Blob container SAS URL
  2. After backup completes, click Upload to Azure
  3. For restore, click Download from Azure to pull backup files from the container

The SAS URL should have Read, Write, and List permissions on the container. Generate it from the Azure portal under Storage Account → Containers → Shared Access Signature.
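Under the hood, a container SAS upload is a plain HTTPS PUT of each file to `<container-url>/<blob-name>?<sas-token>` with an `x-ms-blob-type: BlockBlob` header. A sketch of the URL construction (the SAS URL below is a placeholder, not a real token):

```python
from urllib.parse import urlsplit, urlunsplit

def blob_url(container_sas_url: str, blob_name: str) -> str:
    """Splice a blob name into a container-level SAS URL."""
    scheme, host, path, query, _ = urlsplit(container_sas_url)
    return urlunsplit((scheme, host, f"{path.rstrip('/')}/{blob_name}", query, ""))

sas = "https://myaccount.blob.core.windows.net/backups?sv=2024-01-01&sig=PLACEHOLDER"
print(blob_url(sas, "AdventureWorks.bcp.gz"))
# The upload itself is then an HTTP PUT to that URL with the header
# x-ms-blob-type: BlockBlob (very large files require block-based upload).
```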

Progress Tracking

All four modes display live progress while a job runs, with per-table status and row counts.

The progress DataGrid can be sorted by status, table name, or row count. After completion, a summary shows total rows transferred, errors encountered, and elapsed time.


Typical Workflows

1. Nightly Backup with Cloud Upload

  1. Select all user databases on the Backup tab
  2. Enable snapshot consistency and GZip compression
  3. Set Max Threads to 4–8 (balance speed vs. server load)
  4. Export to a local folder
  5. Upload to Azure Blob Storage for offsite archival

2. Environment Refresh (Dev/Staging from Production)

  1. Use the Migrate tab for streaming server-to-server transfer
  2. Connect to Production as source, Dev/Staging as target
  3. Override target database name if needed
  4. Enable “Drop & Recreate” for a clean refresh
  5. Tables are available on the target in seconds, not hours

3. Cross-Platform Migration (MySQL → SQL Server)

  1. Open the Export/Import tab, direction: Database → Database
  2. Source: MySQL provider, enter host, port, username, password
  3. Target: SQL Server provider, enter connection details
  4. Click Discover Tables
  5. Select tables — Data Studio maps INT → INT, VARCHAR → NVARCHAR, DATETIME → DATETIME2, etc.
  6. Click Start Transfer

4. Export Data for Stakeholders

  1. Direction: Database → File
  2. Connect to the source database
  3. Select tables to export
  4. Choose Excel (XLSX) for business users, JSON for APIs, SQL for deployment scripts
  5. Click Start Transfer — one file per table in the output folder

5. Bulk Import from CSV/JSON

  1. Direction: File → Database
  2. Browse to the source file
  3. Configure target database connection
  4. Review auto-detected columns and types
  5. Click Start Transfer — the target table is auto-created if it doesn’t exist

6. Generate Deployment SQL Scripts

  1. Direction: Database → File, format: SQL Script
  2. Enable Include CREATE TABLE and Include DROP TABLE
  3. Export — each table produces a .sql file with DROP, CREATE, and batched INSERT statements
  4. Run the scripts on the target server using Script Runner or SSMS

System Requirements

| Component | Requirement |
|---|---|
| OS | Windows 10/11 or Windows Server 2016+ |
| Runtime | .NET 8.0 (or self-contained build) |
| SQL Server | 2016+ / Azure SQL MI |
| MySQL | 5.7+ / MariaDB 10+ |
| PostgreSQL | 12+ |
| Oracle | 12c+ |
| Disk Space | Sufficient for exported data (compressed or uncompressed) |
| Installation | None — single portable .exe |
| Permissions | Read access (export); db_owner or equivalent (restore/migrate/import) |