APIC Database Reference

Provenance: Carried over from a separate project (Tyler's prior work). Treat as a factual reference for what APIC's database actually contains and the quality of that data, not as a design document for our own platform. Product-side conclusions are summarized in apic.md in this same directory.

What is APIC?

APIC (Australian Product Information Centre) is a centralised product catalogue and ordering system used by Australian music industry distributors. It provides retailers with a single database covering products from multiple suppliers/distributors, including pricing, stock availability, images, and basic product metadata.

The database is distributed to retailers via FTP as a Sybase SQL Anywhere (ASA) database file (apic5.db). Retailers run APIC's MusiPOS client software which connects to this local database for browsing, ordering, and syncing product data.

How the Data is Retrieved

APIC distributes database snapshots via FTP.

FTP Connection Details

Setting Value
Hostname ftp.hcgsoft.com.au
Port 21 (standard FTP)
Username apicdownloaduser
Password apicdownloaduser$123#

This is a public/shared login used by APIC retailers. Connect with any FTP client (FileZilla, WinSCP, lftp, etc.).

What to Download

The key file is inside the auapic5new/ folder:

auapic5new/apic5.db    # Sybase SQL Anywhere database (binary, proprietary format)

This is the only file needed. The FTP folder may also contain apic5.log (transaction log), an export/ folder with pre-extracted .dat files, and a reload.sql schema file — but these are optional and the conversion process below works directly from the .db file.
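For scripted retrieval, the download can be sketched with Python's stdlib ftplib (a minimal sketch with no retry or integrity checking; the host, credentials, and remote path are the ones documented above):

```python
from ftplib import FTP

# Connection details and remote path as documented above.
HOST = "ftp.hcgsoft.com.au"
USER = "apicdownloaduser"
PASSWORD = "apicdownloaduser$123#"
REMOTE_PATH = "auapic5new/apic5.db"

def download_apic_db(local_path: str = "apic5.db") -> None:
    """Fetch the APIC database snapshot over plain FTP (port 21)."""
    with FTP(HOST) as ftp:
        ftp.login(USER, PASSWORD)
        with open(local_path, "wb") as out:
            ftp.retrbinary(f"RETR {REMOTE_PATH}", out.write)

# Usage: download_apic_db("apic5.db")
```

Because the snapshot is a multi-hundred-megabyte binary, a production fetch would also want resume support and a size check against the server listing.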

The .db file is a proprietary binary format that cannot be read directly without the SAP SQL Anywhere database engine. It must be extracted using SQL Anywhere's dbunload tool, then converted to SQLite. The steps below handle this.

Converting to SQLite

Three stages: upgrade the .db file to a compatible version (if needed), extract the data to CSV using dbunload, then convert the CSVs to SQLite using a Python script.

Prerequisites

  • SAP SQL Anywhere 17 (or compatible version) — needed to extract data from the .db file. A developer/evaluation edition will work. Download from SAP's site or use an existing MusiPOS installation which bundles it.
  • Python 3.6+ — for the CSV-to-SQLite conversion (stdlib only, no pip packages needed)

Stage 1: Upgrade the .db File (if needed)

The .db files on the FTP server may be in an older Sybase format (e.g. ASA 8.0) that SQL Anywhere 17 cannot open directly. If so, use dbunload -an to rebuild the database into a v17-compatible .db file:

dbunload -c "DBF=apic5.db;UID=DBA;PWD=sql" -an apic5_v17.db

This reads the old-format .db file and creates a new v17-compatible .db file. If the file is already v17-compatible, you can skip this step.

Stage 2: Extract Data to CSV

Once you have a v17-compatible .db file, start the database server and use dbunload to extract the data into numbered CSV files and a reload.sql schema file:

# 1. Start the database server
dbeng17 apic5_v17.db -n apic5

# 2. In a separate terminal, extract all tables to an export folder
dbunload -c "UID=DBA;PWD=sql;SERVER=apic5" export_dir

  • -n apic5 (on the dbeng17 command) names the server instance, which dbunload then reaches via SERVER=apic5
  • UID=DBA;PWD=sql — default SQL Anywhere credentials
  • export_dir — output directory for the extracted files

Stop the server afterwards with Ctrl+C or dbstop -y -c "UID=DBA;PWD=sql;SERVER=apic5".

This produces an export_dir/ folder containing numbered .dat files (one per table) and a reload.sql with the full schema. The .dat files are CSVs with this specific format:

  • Encoding: cp1252 (Windows Latin-1)
  • Delimiter: comma
  • Quote character: single quote (')
  • Escape character: backslash (\)
  • NULL representation: empty field or (NULL)
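This dialect maps directly onto Python's csv module. A minimal sketch of a standalone .dat reader (the sample line is invented for illustration):

```python
import csv
import io

# csv.reader settings matching the .dat dialect listed above.
DAT_DIALECT = dict(delimiter=",", quotechar="'", doublequote=False,
                   escapechar="\\")

def read_dat_rows(text):
    """Yield parsed rows from .dat content (already decoded from cp1252),
    mapping empty fields and the literal (NULL) marker to None."""
    for row in csv.reader(io.StringIO(text), **DAT_DIALECT):
        yield [None if field in ("", "(NULL)") else field for field in row]

# A made-up sample line: quoted strings, a backslash-escaped quote,
# and a (NULL) field.
sample = "'0001234567','Don\\'t Fret','(NULL)',29.95\n"
```

Here `list(read_dat_rows(sample))` yields `[['0001234567', "Don't Fret", None, '29.95']]`; the escapechar handling is what lets titles containing apostrophes survive the single-quote quoting.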

Stage 3: Convert Extracted CSVs to SQLite

Once you have the export folder (containing reload.sql and numbered .dat files), use the Python script below. It is schema-agnostic — it parses reload.sql to discover all tables, column names, types, and .dat file mappings automatically. It works with any APIC database export (apic5, ApicSA, etc.).

python convert_apic_to_sqlite.py <export_dir> [output.db]
  • export_dir — the folder containing reload.sql and .dat files (or a parent folder that has reload.sql with an export/ subfolder)
  • output.db — optional, defaults to apic_sqlite.db in the current directory

convert_apic_to_sqlite.py:

"""
APIC Sybase Export to SQLite Converter (schema-agnostic)

Parses reload.sql to discover all tables, columns, types, and .dat file mappings.
Works with any APIC database export (apic5, ApicSA, etc.) without hardcoded schema.

Usage:
    python convert_apic_to_sqlite.py <export_dir> [output.db]

    export_dir  - folder created by dbunload. Can be either:
                  - the folder containing reload.sql directly, or
                  - a parent folder that has reload.sql and an export/ subfolder
    output.db   - output SQLite file (default: apic_sqlite.db in current dir)
"""
import sqlite3, csv, os, sys, re


def find_paths(base_dir):
    """Locate reload.sql and the .dat files directory."""
    # Check for reload.sql directly in base_dir
    reload_path = os.path.join(base_dir, 'reload.sql')
    if os.path.exists(reload_path):
        # .dat files might be in same dir or in export/ subfolder
        export_sub = os.path.join(base_dir, 'export')
        if os.path.isdir(export_sub):
            return reload_path, export_sub
        return reload_path, base_dir

    # Check common subfolder patterns
    for sub in os.listdir(base_dir):
        sub_path = os.path.join(base_dir, sub)
        if os.path.isdir(sub_path):
            reload_candidate = os.path.join(sub_path, 'reload.sql')
            if os.path.exists(reload_candidate):
                export_sub = os.path.join(sub_path, 'export')
                if os.path.isdir(export_sub):
                    return reload_candidate, export_sub
                return reload_candidate, sub_path

    return None, None


def parse_reload_sql(reload_path):
    """Parse reload.sql to extract table schemas and .dat file mappings."""
    with open(reload_path, 'r', encoding='utf-8', errors='replace') as f:
        content = f.read()

    # Parse CREATE TABLE statements for column names and types
    table_schemas = {}
    for match in re.finditer(
        r'CREATE TABLE "DBA"\."(\w+)"\s*\((.*?)\)\s*\ngo',
        content, re.DOTALL
    ):
        table_name = match.group(1)
        body = match.group(2)
        columns = []
        for col_match in re.finditer(
            r'"(\w+)"\s+(char|varchar|integer|smallint|numeric|decimal|float|'
            r'double|real|timestamp|date|time|long varchar|binary|bit|bigint|'
            r'tinyint|money|smallmoney)(?:\([^)]*\))?',
            body, re.IGNORECASE
        ):
            col_name = col_match.group(1)
            col_type = col_match.group(2).upper()
            if col_type in ('INTEGER', 'SMALLINT', 'BIGINT', 'TINYINT', 'BIT'):
                sqlite_type = 'INTEGER'
            elif col_type in ('NUMERIC', 'DECIMAL', 'FLOAT', 'DOUBLE',
                              'REAL', 'MONEY', 'SMALLMONEY'):
                sqlite_type = 'REAL'
            else:
                sqlite_type = 'TEXT'
            columns.append((col_name, sqlite_type))
        if columns:
            table_schemas[table_name] = columns

    # Parse LOAD TABLE statements for .dat file number mappings
    table_files = {}
    for match in re.finditer(
        r'LOAD TABLE "DBA"\."(\w+)"\s*\(([^)]+)\)\s*\n\s*FROM '
        r"'[^']*?(\d+)\.dat'",
        content
    ):
        table_name = match.group(1)
        col_list_raw = match.group(2)
        file_num = int(match.group(3))
        load_columns = re.findall(r'"(\w+)"', col_list_raw)
        table_files[table_name] = {
            'file_num': file_num,
            'load_columns': load_columns
        }

    # Merge: LOAD TABLE column order (matches .dat) with CREATE TABLE types
    tables = []
    for table_name, file_info in table_files.items():
        schema_dict = dict(table_schemas.get(table_name, []))
        cols = []
        for col_name in file_info['load_columns']:
            sqlite_type = schema_dict.get(col_name, 'TEXT')
            cols.append((col_name, sqlite_type))
        tables.append((table_name, file_info['file_num'], cols))

    tables.sort(key=lambda t: t[1])
    return tables


def convert(export_dir, output_db):
    reload_path, dat_dir = find_paths(export_dir)

    if not reload_path:
        print(f"ERROR: Could not find reload.sql in or under: {export_dir}")
        print("Make sure you point to the directory created by dbunload,")
        print("or a parent directory containing it.")
        sys.exit(1)

    print(f"Schema:    {reload_path}")
    print(f"Data dir:  {dat_dir}")

    tables = parse_reload_sql(reload_path)
    print(f"Found {len(tables)} tables\n")

    if os.path.exists(output_db):
        os.remove(output_db)

    conn = sqlite3.connect(output_db)
    conn.execute("PRAGMA journal_mode=WAL")
    conn.execute("PRAGMA synchronous=OFF")
    conn.execute("PRAGMA cache_size=-64000")
    c = conn.cursor()
    total = 0
    loaded = 0
    skipped = 0

    for tname, dnum, cols in tables:
        dat = os.path.join(dat_dir, f"{dnum}.dat")
        if not os.path.exists(dat):
            skipped += 1
            continue

        col_defs = ", ".join(f'"{n}" {t}' for n, t in cols)
        c.execute(f'CREATE TABLE IF NOT EXISTS "{tname}" ({col_defs})')

        ncols = len(cols)
        col_types = [t for _, t in cols]
        placeholders = ",".join(["?"] * ncols)
        insert = f'INSERT OR IGNORE INTO "{tname}" VALUES ({placeholders})'

        count = 0
        with open(dat, 'r', encoding='cp1252', errors='replace') as f:
            reader = csv.reader(
                f, delimiter=',', quotechar="'",
                doublequote=False, escapechar='\\'
            )
            batch = []
            for row in reader:
                if len(row) < ncols:
                    continue
                if len(row) > ncols:
                    row = row[:ncols]
                cleaned = []
                for i, val in enumerate(row):
                    v = val.strip() if val else None
                    if v == '' or v == '(NULL)':
                        v = None
                    elif col_types[i] == 'INTEGER':
                        try:
                            v = int(v)
                        except (TypeError, ValueError):
                            v = None
                    elif col_types[i] == 'REAL':
                        try:
                            v = float(v)
                        except (TypeError, ValueError):
                            v = None
                    cleaned.append(v)
                batch.append(cleaned)
                if len(batch) >= 10000:
                    c.executemany(insert, batch)
                    count += len(batch)
                    batch = []
            if batch:
                c.executemany(insert, batch)
                count += len(batch)
        conn.commit()
        total += count
        loaded += 1
        print(f"  {tname}: {count:,} rows")

    try:
        conn.execute("VACUUM")
    except sqlite3.OperationalError:
        pass  # VACUUM can fail on low disk; data is still intact
    conn.close()
    sz = os.path.getsize(output_db)
    print(f"\n{loaded} tables loaded, {skipped} skipped (no .dat file)")
    print(f"{total:,} total rows | {sz/(1024*1024):.1f} MB")
    print(f"Saved to: {output_db}")


if __name__ == "__main__":
    if len(sys.argv) < 2:
        print(__doc__)
        sys.exit(1)

    export_dir = sys.argv[1]
    output_db = sys.argv[2] if len(sys.argv) > 2 else "apic_sqlite.db"
    convert(export_dir, output_db)

File Number to Table Mapping

The dbunload tool assigns sequential file numbers to tables. The numbers below correspond to the known APIC schema:

.dat file Table Description
393 wrdaud Word audit (internal)
394–398 pbcat* PowerBuilder catalogue metadata (internal)
399 ap4itm Main product catalogue (671K rows)
400 ap5itm Product catalogue v5 (empty — data is in ap4itm)
401–402 ap4wrd, ap5wrd Word/search index tables (empty)
403 ap4rsp Supplier/distributor directory (238 suppliers)
404–405 ap4dda, ap4ddc Document/display definitions (empty)
406 ap4ins Instrument category codes (140 codes)
407 ap4sub Sub-department codes (1,019 codes)
408 ap4trd Trade/pricing tier definitions (16 rows)
409 ap4amb AMEB (exam board) items (empty)
410 ap4ctl Control/config records (50 rows)
411 ap4hlp Help text (11 rows)
412 ap4int Interest category codes (22 codes)
413 sp4cum Customer record (1 row — the local dealer)
414–416 sp4phd, sp4pop, sp4qpc Purchase orders / pricing (empty — local data)
417 sp4shp Shop/dealer info (1 row)
418 wrdaud_copy Word audit copy (empty)
419 ap4rsp_temp Supplier directory backup (238 rows)
420 ap4rgr AMEB grade codes (11 grades)
421 ap4rls Series/list codes (8 codes)
422 sp4prc Price increase records (empty)
423 sp4smm Shop summary/memo (1 row)
424 ab6ran Random number audit (2 rows)
426 item_images Product image filenames (288K rows)
427 item_specs Product specification text (108K rows)
428–429 excdealer, excdealer_sub Exclusive dealer restrictions
430 item_mp Media/product files (2 rows)
431 item_pdf Product PDF documents (1,258 rows)
432 item_sa Stock availability flags (495K rows)
433 sp4cum_hcg Extended customer records (473 rows)
434 item_specials On-special flags (empty)
435 cust_module_list Dealer module registrations (388 rows)
436 item_measure Weight, dimensions, UPC/EAN (47K rows)
437–438 ap4itm_new, ap5itm_new Mirror copies of product tables

Database Schema — Key Tables

ap4itm — Main Product Catalogue (671,132 rows)

The core table. Every product in the APIC system has a row here.

Column Type Description
itm_iid TEXT Primary key. APIC internal item ID (10-digit numeric string)
itm_lno TEXT Line number (used with itm_iid for multi-line items)
itm_title TEXT Product title/name
itm_composer TEXT Composer (print music) or secondary description
itm_publisher TEXT Publisher or brand name
itm_alt TEXT Alternate/additional title
itm_sub_dept TEXT Sub-department code (FK → ap4sub.sub_sub_id)
itm_status TEXT Status flag: OLD, NEW, N, or NULL
itm_titles_in_alb TEXT Number of titles (print music albums)
itm_new_retail_price REAL RRP including GST (AUD)
itm_color TEXT Colour/variant description
itm_supplier_id TEXT Supplier code (FK → ap4rsp.rsp_supplier_id), e.g. FENDER, ROLAND, YAMAHA
itm_supplier_iid TEXT Supplier's own product code/SKU
itm_input_date TEXT Date product was added to APIC
itm_last_act_date TEXT Date of last activity/update
itm_barcode_flag TEXT Whether product has a barcode assigned
itm_barcode TEXT APIC-assigned barcode (usually 930-prefix — not a real EAN/UPC)
itm_brief TEXT Brief description
itm_instrument TEXT Instrument category code (FK → ap4ins.ins_id)
itm_grade TEXT AMEB grade code (FK → ap4rgr.rgr_grade)
itm_list TEXT Series/list code (FK → ap4rls.rls_list)
itm_pic_flag TEXT Whether an image exists for this item
itm_prod_type TEXT Product type flag
itm_trade_id TEXT Trade pricing tier

Status values:

Status Count Meaning
OLD 516,045 Established products, not recently added
N 103,317 (Unclear — possibly inactive/discontinued)
NULL 39,375 No status set
NEW 12,305 Recently added products

ap4rsp — Supplier Directory (238 suppliers)

Column Type Description
rsp_supplier_id TEXT Primary key. Short supplier code (e.g. FENDER, ROLAND, YAMAHA, AMS)
rsp_name TEXT Full supplier/distributor name
rsp_address_1/2, rsp_city, rsp_country, rsp_post_code TEXT Address fields
rsp_tel, rsp_fax TEXT Contact numbers
rsp_supplier_type TEXT Supplier classification
rsp_retail_factor REAL Retail price multiplier
rsp_disc_1 through rsp_disc_6 REAL Discount tier percentages
rsp_abn_no TEXT Australian Business Number

ap4ins — Instrument Category Codes (140 codes)

Column Type Description
ins_id TEXT 3-character code (e.g. AGM = Acoustic Guitars Musical, KEM = Keyboards Musical)
ins_desc TEXT Full description
ins_type TEXT MUSICAL or PRINT

ap4sub — Sub-Department Codes (1,019 codes)

Column Type Description
sub_id TEXT Parent instrument code (FK → ap4ins.ins_id)
sub_sub_id TEXT Sub-department code (e.g. AGM01 = Acoustic Guitar Cases)
sub_desc TEXT Description

item_images — Product Images (288,117 rows)

Column Type Description
item_id TEXT FK → ap4itm.itm_iid
item_ln_no INTEGER Image sequence number (multiple images per product)
image_filename TEXT Filename (e.g. GP189.JPG). Images are distributed separately via FTP.

item_specs — Product Specifications (108,251 rows)

Column Type Description
item_id TEXT FK → ap4itm.itm_iid
item_spec TEXT Free-text specification/description. Varies from a single sentence to detailed content. Example: "By Boris Berlin. For Piano. Music for the Beginner. Early Elementary. Book. 48 pages. Published by The Frederick Harris Music Company"

item_sa — Stock Availability (494,680 rows)

Column Type Description
item_id TEXT FK → ap4itm.itm_iid
stock_available TEXT Y (in stock at distributor), N (out of stock)
last_mod_date TEXT Timestamp of last stock status change
item_eta TEXT Expected arrival date (1900-01-01 = unknown/not set)

item_measure — Physical Measurements & Barcodes (46,984 rows)

Column Type Description
item_id TEXT FK → ap4itm.itm_iid
item_mpq INTEGER Minimum pack quantity
item_box_qty INTEGER Box/carton quantity
item_cbm REAL Cubic metres
item_weight REAL Weight (kg)
item_length REAL Length
item_width REAL Width
item_depth REAL Depth
item_mpn TEXT Manufacturer Part Number
item_upc TEXT UPC barcode
item_ean TEXT EAN barcode
item_jan TEXT JAN barcode (Japanese)
item_isbn TEXT ISBN (books/print music)

Coverage note: Only 47K of 671K products have an item_measure record. Of those, only ~10,500 have weight populated and ~10,150 have dimensions.

APIC Barcode System

APIC assigns its own internal barcodes with a 930 prefix (e.g. 9301100032826). These are not real EAN/UPC barcodes — they are APIC-internal identifiers used to link products across the system. Real EAN/UPC barcodes, where available, are stored in the item_measure table (item_ean, item_upc columns), but coverage is low (~7% of products).
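A sketch of telling the two kinds of code apart (the 930-prefix heuristic is from this document; the check-digit formula is the standard EAN-13 one):

```python
def ean13_check_digit(first12: str) -> int:
    """Standard EAN-13 check digit: weight digits 1,3,1,3,... from the left."""
    total = sum(int(d) * (3 if i % 2 else 1) for i, d in enumerate(first12))
    return (10 - total % 10) % 10

def is_valid_ean13(code: str) -> bool:
    """True if code is 13 digits with a correct EAN-13 check digit."""
    return (len(code) == 13 and code.isdigit()
            and int(code[-1]) == ean13_check_digit(code[:12]))

def is_apic_internal_barcode(code: str) -> bool:
    """Prefix heuristic from this document: 13 digits starting with 930."""
    return len(code) == 13 and code.isdigit() and code.startswith("930")
```

Note that genuine GS1 Australia EANs also begin with prefixes in the 930 to 939 range, so the prefix alone is not conclusive; checksum validation (or the presence of a value in item_measure.item_ean) is a stronger signal.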

Key Relationships

ap4itm.itm_iid ──────┬── item_images.item_id    (product images)
                      ├── item_specs.item_id     (specifications)
                      ├── item_sa.item_id        (stock availability)
                      ├── item_measure.item_id   (weight/dims/barcodes)
                      ├── item_pdf.item_id       (product PDFs)
                      └── item_mp.item_id        (media files)

ap4itm.itm_supplier_id ── ap4rsp.rsp_supplier_id (supplier details)
ap4itm.itm_instrument ─── ap4ins.ins_id          (instrument category)
ap4itm.itm_sub_dept ───── ap4sub.sub_sub_id      (sub-department)
ap4itm.itm_grade ──────── ap4rgr.rgr_grade       (AMEB grade)
ap4itm.itm_list ───────── ap4rls.rls_list        (series/list)
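Once converted to SQLite, these relationships are ordinary joins. A sketch that assembles one enriched product record (the table and column names are from the schema above; the helper name and database path are illustrative):

```python
import sqlite3
from contextlib import closing

# One product with its supplier, stock flag, EAN, and image filenames,
# per the relationship diagram above.
QUERY = """
SELECT i.itm_iid,
       i.itm_title,
       i.itm_new_retail_price,
       r.rsp_name AS supplier_name,
       s.stock_available,
       m.item_ean,
       GROUP_CONCAT(img.image_filename) AS images
FROM ap4itm i
LEFT JOIN ap4rsp       r   ON r.rsp_supplier_id = i.itm_supplier_id
LEFT JOIN item_sa      s   ON s.item_id   = i.itm_iid
LEFT JOIN item_measure m   ON m.item_id   = i.itm_iid
LEFT JOIN item_images  img ON img.item_id = i.itm_iid
WHERE i.itm_iid = ?
GROUP BY i.itm_iid
"""

def fetch_product(db_path: str, itm_iid: str):
    """Return one enriched product record as a dict, or None if not found."""
    with closing(sqlite3.connect(db_path)) as conn:
        conn.row_factory = sqlite3.Row
        row = conn.execute(QUERY, (itm_iid,)).fetchone()
        return dict(row) if row else None
```

The LEFT JOINs matter: most products lack an item_measure row, and roughly half lack specs, so inner joins would silently drop them.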

Linking APIC to Cin7

Two methods for matching APIC products to Cin7 inventory:

  1. APIC Barcode → Cin7 AdditionalAttribute10: Cin7 stores the APIC 930-prefix barcode in AdditionalAttribute10. Match against ap4itm.itm_barcode. This is the most reliable method when the APIC barcode has been recorded in Cin7.

  2. Supplier Product Code → APIC Supplier Item IID: Cin7's SupplierProductCode field matches ap4itm.itm_supplier_iid (the supplier's own SKU). This requires also matching the supplier to avoid false positives where different suppliers reuse the same SKU.
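The two methods above can be sketched as a cascade, trying the barcode match first and falling back to the (supplier, SKU) pair. The Cin7-side field names ProductCode and Supplier are assumptions for illustration; AdditionalAttribute10 and SupplierProductCode are the fields named in this document. In practice the Cin7 supplier value must first be normalised to the APIC supplier code before the pair match is reliable.

```python
def match_apic_to_cin7(apic_items, cin7_products):
    """Return {cin7_product_code: apic_itm_iid} using method 1 then method 2.
    Inputs are lists of dicts keyed by the field names described above."""
    # Method 1 index: APIC 930-prefix barcode.
    by_barcode = {a["itm_barcode"]: a
                  for a in apic_items if a.get("itm_barcode")}
    # Method 2 index: (supplier code, supplier's own SKU).
    by_supplier_sku = {
        (a["itm_supplier_id"], a["itm_supplier_iid"]): a
        for a in apic_items
        if a.get("itm_supplier_id") and a.get("itm_supplier_iid")
    }
    matches = {}
    for p in cin7_products:
        apic = by_barcode.get(p.get("AdditionalAttribute10"))
        if apic is None:
            key = (p.get("Supplier"), p.get("SupplierProductCode"))
            apic = by_supplier_sku.get(key)
        if apic is not None:
            matches[p["ProductCode"]] = apic["itm_iid"]
    return matches
```

Keying method 2 on the supplier as well as the SKU is what prevents the false positives noted above, where different suppliers reuse the same product code.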

Data Quality Assessment

Assessed April 2026 against a Cin7 inventory of 68,100 products, a Roland dealer portal export (1,919 SKUs), and a Fender dealer portal export (4,840 SKUs).

Coverage

Comparison Match Rate
Cin7 inventory (68,100 products) 48,560 matched (71.3%)
Roland dealer portal (1,919 SKUs) 1,863 matched (97.1%)
Fender dealer portal (4,840 SKUs) 4,831 matched (99.8%)

Of the 19,540 unmatched Cin7 items, only 559 have an APIC-style 930 barcode (should be matchable but weren't linked). The remaining ~19,000 are products from brands/categories not in the APIC system at all.

Pricing Accuracy

Comparison Exact Match APIC Higher APIC Lower
Cin7 vs APIC (48,386 with prices) 37,774 (78.1%) 8,260 (17.1%) 2,352 (4.9%)
Roland vs APIC 85.2% within $1
Fender vs APIC 99.7% exact

Price difference breakdown (Cin7 vs APIC):

Difference Range APIC Higher APIC Lower
Under 5% 558 1,018
5–10% 1,020 425
10–20% 1,482 396
20–50% 2,887 354
50%+ 2,120 159

The ~6,500 items where APIC is 10%+ higher (1,482 + 2,887 + 2,120 in the table above) are likely stale Cin7 prices that need updating.

Enrichment Value by Data Type

Data Type APIC Coverage (of matched items) Enrichment Value
RRP Pricing 99.6% High — reliable for price validation and stale-price detection
Product Title 100% Moderate — can serve as product descriptions where none exist
Image References 69.7% (33,847 items) Moderate — filenames only; actual images distributed separately via FTP
Specifications 49.9% (24,221 items) Moderate — free-text, varies in quality and detail
Stock Availability 76.4% (37,081 items) Moderate — simple Y/N flag with timestamp
Weight 7.5% (3,659 items) Low — very sparse coverage
Dimensions 6.9% (3,370 items) Low — very sparse coverage
EAN/UPC Barcodes 6.9% (3,333 items) Low — very sparse coverage
Detailed Specs N/A Not comparable — supplier portals provide dramatically richer structured data (e.g. Fender: 166 spec columns, Roland: 26 columns) vs APIC's single free-text field

Top Suppliers by Matched Volume

APIC Supplier ID Matched Items
PRINT M 9,067
AMS 5,316
DYNAMIC 4,393
FENDER 4,181
AUSTRAL 4,071
DADDAUS 2,902
ALFRED 2,160
PRO 2,061
YAMAHA 1,713
ELECTRI 1,673
ROLAND 1,207
CMI 1,084

Freshness

The APIC extract assessed was from early-to-mid 2025. Of 48,560 matched items, 92% are marked status OLD. The itm_last_act_date field shows activity dates ranging from the 2000s through March 2025.

Summary Verdict

APIC is a reliable reference for product existence and RRP pricing across the major Australian music distributors. It covers ~71% of a typical music retailer's catalogue and its pricing is accurate enough to use as a sanity check and stale-price detector.

It is not a replacement for supplier portal data when it comes to rich product content (descriptions, detailed specifications, marketing images, dimensions, weight). Supplier portals provide dramatically richer data — Fender alone offers 166 structured specification columns versus APIC's single free-text field.

Best uses for a PDM system:

  • Product master list / catalogue backbone — "does this product exist and what's the current RRP?"
  • Price change detection — flag items where APIC RRP differs from the system of record by more than a threshold
  • New product discovery — monitor for newly added APIC items
  • Basic product categorisation via instrument/sub-department codes
  • Supplier product code cross-reference (itm_supplier_iid ↔ supplier SKU)
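The price change detection use above can be sketched as a simple drift check. The input shapes (plain {item_id: price} dicts) and the 10% default threshold are assumptions for illustration; the threshold would be tuned to the retailer's tolerance.

```python
def find_stale_prices(system_prices, apic_prices, threshold=0.10):
    """Flag items whose system-of-record price drifts from the APIC RRP
    by more than `threshold` (as a fraction of the APIC RRP).

    Returns a list of (item_id, system_price, apic_rrp, drift) tuples,
    where positive drift means APIC is higher than the system of record."""
    flagged = []
    for item_id, apic_rrp in apic_prices.items():
        ours = system_prices.get(item_id)
        if ours is None or not apic_rrp:
            continue  # unmatched item, or APIC price missing/zero
        drift = (apic_rrp - ours) / apic_rrp
        if abs(drift) > threshold:
            flagged.append((item_id, ours, apic_rrp, round(drift, 3)))
    return flagged
```

Run against the matched Cin7 set, a 10% threshold would surface roughly the "10%+ higher" band from the pricing table above as candidates for review rather than blind auto-update, since some differences reflect deliberate local pricing.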