diff --git a/.gitignore b/.gitignore index 62c8935..b74989a 100644 --- a/.gitignore +++ b/.gitignore @@ -1 +1,6 @@ -.idea/ \ No newline at end of file +.idea/ +.roo* +.aider* +AGENT* +*.csv +*.iml \ No newline at end of file diff --git a/QUICK_GUIDE.md b/QUICK_GUIDE.md new file mode 100644 index 0000000..03c171f --- /dev/null +++ b/QUICK_GUIDE.md @@ -0,0 +1,159 @@
# Python API (alpha) - Quick Guide

## What is it?
The Python API lets you create your own custom Bookmap indicators in Python directly within Bookmap, without the need for an external IDE or code editor.

All you need is basic knowledge of Python and a clear idea of what you want to build.

See the [Python API GitHub repository](https://github.com/bookmap-public/python-api) for documentation and code examples.

**Note:** External code editors / IDEs are still supported. Feel free to use them. You can find the [bookmap library on PyPI](https://pypi.org/project/bookmap/).

## What can I do with it?
The ultimate purpose of the Python API is to provide the full functionality of the Simplified Bookmap API, which allows you to create indicators and trading strategies. It is still in active development, however, and not all features are complete.

### Supported features:

* Draw custom indicators
* Basic user settings (numbers, strings, checkboxes, colors), including persisting them across sessions
* Receive order book and trade events
* Receive market-by-order (MBO) events
* Trading (send/update/remove orders)
* Receive the user's position / balance information
* Receive historical data

### Not supported yet:

* Receive bar (candlestick) events - *can be computed manually*
* Receive aggregate price levels - *can be computed manually*
* Advanced user settings

## Example Addon
Here's an example Python API addon: a copy of our Java Liquidity Tracker addon. The Python variant is functionally the same as the Java one.

The number of trackable price levels is configurable.
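To make this concrete, the core computation behind such a liquidity tracker can be sketched in plain Python. This is an illustration only, not the Bookmap API: the order-book dict and the `top_liquidity` helper are hypothetical.

```python
# Hypothetical sketch: sum the liquidity resting on the top N price levels
# of one side of an order book. A liquidity tracker draws this sum as an
# indicator for each side; N (the number of tracked levels) is the
# user-configurable setting mentioned above.

def top_liquidity(levels, n, side):
    """levels: dict mapping price -> size; side: 'bid' or 'ask'."""
    # Best bids are the highest prices; best asks are the lowest.
    best_first = sorted(levels, reverse=(side == "bid"))
    return sum(levels[price] for price in best_first[:n])

bids = {100.0: 5, 99.5: 3, 99.0: 10, 98.5: 2}
print(top_liquidity(bids, 2, "bid"))  # liquidity on the two best bid levels -> 8
```

The real addon receives depth updates incrementally and redraws the indicator on each update; the sketch only shows the aggregation step.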
## Community
Please join our `#python-api` Discord channel.

## How can I start?
To begin development, only two things are required:
1. Python >=3.6 installed on your machine - [get it here](https://www.python.org/downloads/).
2. The Bookmap Python API addon, used to build and edit code. To install it, follow these steps:
    * Open Bookmap in any mode
    * Subscribe to any instrument
    * Click on the "Configure add-ons" icon
    * Go to "Manage add-ons"
    * Click on the "Install" button near the Python API add-on

Afterward, open Bookmap, select your favorite adapter, and subscribe to an instrument you want to work with.

**Note:** Since this product is evolving rapidly, the guide below may be slightly outdated. Please notify us of any inconsistencies in the `#python-api` Discord channel.

To run your first addon, go through the following steps:
* Click on Settings.
* Click on Configure addons.
* Click Add…
* Select the previously downloaded Python Bookmap API Addon JAR file.
* Check that the addon was attached correctly and that the addons window contains 'Python API'. Enable the addon to activate the editor.

Click on 'Open embedded editor' to open the editor window.
The left part is the menu with a list of scripts. The right part contains the code editor.
At the top of the right part, there are three buttons:
* **Save** - Saves the current file.
* **Build** - Creates the addon from your Python addon script. The result is a JAR file in the build directory (File -> Open build folder).
* **Set custom runtime** - Selects the custom runtime used to run your script. By default it uses the `python3` command. If you want to use a virtual environment (with custom libraries), or perhaps an Anaconda runtime, select its Python binary here.

## Your first addon
To understand the process of producing a working addon, here are the steps for writing a hello world example.
**Note:** You can also use an external code editor such as Visual Studio Code.

1. Create a file. To do this, right-click on the left part of the window, select New Python file, enter a name, and click OK.

2. Let's import the Bookmap Python library into this file. There is no need to install it manually; it is already installed. Click on the newly created file and add the line `import pyl1api as bm`.

    Every script should import this library. We gave it the `bm` alias to make its usage shorter.

3. Your addon receives events (for example, a notification about a new trade or a newly subscribed instrument) via so-called handler functions (also called callbacks), and your Python code is responsible for handling these events. There is no need to handle every possible event, just the ones your addon needs.

    The two mandatory events that have to be handled are:
    * subscription to a new instrument (triggered when you subscribe to a new instrument in Bookmap),
    * detaching the instrument (triggered when you unsubscribe from the instrument).

    When you define a handler function, make sure to specify the exact parameters, as specified [here](https://github.com/bookmap-public/python-api) or in the API reference.

4. Let's add these two mandatory handlers. Here's an example (you can copy-paste):
    ```python
    def handle_subscribe_instrument(addon, alias, full_name, is_crypto, pips, size_multiplier, instrument_multiplier):
        print("Subscribing to instrument " + alias, flush=True)

    def handle_unsubscribe_instrument(addon, alias):
        print("Unsubscribing from instrument " + alias, flush=True)
    ```

5. To actually use these handlers, we need to create the addon object and start the addon, passing in the handler functions. Add the code below to your file.

    ```python
    # This 'if' is not strictly needed here, but it's a useful convention.
    if __name__ == "__main__":
        # Create an addon object.
        addon = bm.create_addon()

        # Start the addon. We pass in the addon object and the two handler function names.
        bm.start_addon(addon, handle_subscribe_instrument, handle_unsubscribe_instrument)

        # Then we wait until the addon is turned off.
        bm.wait_until_addon_is_turned_off(addon)
    ```

The final result is the combination of the snippets above.

Finally, it is time to build your script. Click on the 'Build' button.

**Note** that your build may fail if the Python 3 runtime isn't detectable by default. In that case, click "Set custom runtime" and select the Python 3 binary. If you don't have Python yet, download it from the [Python website](https://www.python.org/).

After a successful build, click File -> Open build folder to find your addon JAR. We will use this file to run the addon from Bookmap.

The next step is to add the newly built addon JAR file to Bookmap. To do this, return to the main chart where you subscribed to the instrument and click again on Settings -> Configure addons. Now add the newly built JAR file by clicking on the 'Add…' button and selecting it from the build folder.

**Note:** In the future, we will improve this process so that there is no need to select the JAR file from the file system every time an addon is built.

Finally, let's enable it by clicking on the checkbox near the name… And we do not see anything. Why?

Our addon only prints a log message when an instrument is added or removed. To see it, open File -> Show log file in the main Bookmap window. All messages printed from Python are marked with the `[PYTHON-CLIENT]` category.

We created a first addon that doesn't do anything particularly useful, but we learned what the entire Python API addon development process looks like.

## More examples
There are more examples showing a wider set of functionality than the hello world script. You can find all examples in our [Python API GitHub repository](https://github.com/bookmap-public/python-api).
* `cvd_addon.py` - provides a Python implementation of the CVD addon. A good addon to start with; it shows how to subscribe to trades and create an indicator for each instrument.
* `order_book_test.py` - shows an example of using the order book utility objects. This entity helps you work with depth data by organising it into an order book, which can be useful to have in your addon. The example also shows how to subscribe to depth data and handle it.
* `liquidity_tracker.py` - a script that implements a liquidity tracker indicator. It subscribes to depth data and registers an indicator for each side (bid and ask), drawing liquidity according to the received depth data. Moreover, it provides UI configuration allowing you to set the number of price levels tracked for liquidity. The minimum value is 5 and the maximum is 25.
* `simple_market_maker.py` - a trading strategy script that uses a simple algorithm to trade. It places 2 orders (buy and sell) below the current best bid and above the current best ask, with a shift you can specify in the addon settings (`Price offset`). Every x seconds (`Interval` in the addon settings) it does the following: if the price changed and an order wasn't executed, the order is moved to restore the shift from the best bid or best ask (depending on the order side). If the price changed and an order was executed, another order is placed with the same shift from the current best bid/ask. The `Threshold` value in the settings specifies how big your position delta must be before the strategy stops placing buy or sell orders. For example, if your threshold is 5 and you already have a position greater than or equal to 5, buy orders won't be placed; the strategy will send only sell orders until your position becomes less than 5. The same works in the opposite direction: if your position is -5, sell orders won't be placed.

## FAQ
**Can I write a trading strategy with this addon?**
Yes. Please see our [GitHub repository](https://github.com/bookmap-public/python-api) for documentation and examples.
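As an illustration, the quoting rule described above for `simple_market_maker.py` can be sketched in plain Python. This is a sketch of the algorithm only, not the Bookmap trading API; the `desired_quotes` helper and its parameter names are hypothetical.

```python
# Hypothetical sketch of the simple market-maker's quoting rule:
# quote one buy below the best bid and one sell above the best ask,
# offset by `price_offset`, and suppress a side once the position
# delta reaches `threshold` (as described for simple_market_maker.py).

def desired_quotes(best_bid, best_ask, price_offset, position, threshold):
    quotes = {}
    if position < threshold:    # position not too long -> keep buying
        quotes["buy"] = best_bid - price_offset
    if position > -threshold:   # position not too short -> keep selling
        quotes["sell"] = best_ask + price_offset
    return quotes

# Flat position: quote both sides.
print(desired_quotes(100.0, 100.5, 0.25, position=0, threshold=5))   # {'buy': 99.75, 'sell': 100.75}
# Long 5 or more: only the sell quote remains.
print(desired_quotes(100.0, 100.5, 0.25, position=5, threshold=5))   # {'sell': 100.75}
```

The actual script additionally re-quotes every `Interval` seconds and moves unfilled orders whenever the best bid/ask changes.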
**Can I develop an integration with exchanges I am interested in?**
No, the Python API is intended for addons. Please use Java (with Bookmap's L0 API) to write your custom exchange integrations.

**I developed my addon and it crashed when I started it. Is this a bug? What should I do?**
It might be a bug, but it also might be an issue with your Python script. If the script fails for any reason (e.g. a syntax error), Bookmap won't be able to communicate with it properly and will treat this as a critical error. You can open the Bookmap logs and check for any errors from Python. All errors/logs are redirected to Bookmap under the special category `[PYTHON-CLIENT]`; just search for this category. If you do not see the reason for the issue, please report it in the Discord channel.

**I do not like your editor. Can I use my favorite VS Code/Sublime/PyCharm?**
Yes. Just use the `/Python` directory as your workspace in your favorite editor and you can edit the code from there. You can download the [bookmap package from PyPI](https://pypi.org/project/bookmap/).

**Can I get a reference to your library so autocompletion in my editor works properly?**
Yes. If you use an editor that supports autocompletion, you can do this with the [bookmap package downloaded from PyPI](https://pypi.org/project/bookmap/). For now, the embedded editor doesn't support autocompletion.

**Where can I get the API reference?**
See the [Python API GitHub repository](https://github.com/bookmap-public/python-api).

**Can I request support of a feature I am dreaming about?**
Yes, ask about it in the Discord channel.

**Can I use Python 2?**
No, we only support Python 3.

**Can I distribute these addons, e.g. to my friends?**
Yes. The safer method is to send the script and ask them to build it themselves, since some of the configuration you did for your version might not suit theirs, e.g. the Python runtime path or the TCP port. Obviously, their Python runtime path can differ from yours.
Alternatively, you can set all config values to match their environment and hope that nothing breaks; in that case, distributing the built JAR is possible. \ No newline at end of file diff --git a/examples/bnb_walls_snapshot.py b/examples/bnb_walls_snapshot.py new file mode 100644 index 0000000..e4c5447 --- /dev/null +++ b/examples/bnb_walls_snapshot.py @@ -0,0 +1,346 @@
"""
BNB Walls Snapshot - Immediate Order Book Liquidity Snapshot Exporter

This Bookmap addon captures a single snapshot of BNB/USDT order book liquidity (supports BNBUSDT and BNB-USDT formats)
immediately after subscription and exports it to a user-configured CSV file.

Features:
- One-time snapshot capture (no periodic exports)
- Monitors bid/ask liquidity across 1,701 price buckets (300 to 2,000 USD)
- Each bucket represents a $1 range starting at an integer price level
- Configurable CSV output path via the settings UI
- Exports immediately after the order book snapshot is received
- Default export directory: /Users/dannyalmaden/python_projects/python-api-bookmap/data
- Default file name: bnbusdt_snapshot.csv

Bucket Behavior:
- Bucket 300: Aggregates orders from $300.00 to $300.99
- Bucket 500: Aggregates orders from $500.00 to $500.99
- Bucket 800: Aggregates orders from $800.00 to $800.99
- Orders at intermediate prices (e.g., $800.50) fall into the bucket of
  their integer part (here, bucket 800)

CSV Format:
    price,bid_bnb_qty,ask_bnb_qty
    300,155.7,12.3
    301,150.3,15.8
    ...
Author: Bookmap Python API
"""

import bookmap as bm
import csv
import threading
import os

# Global data structures
instrument_data = {}

# Configuration constants
BUCKET_SIZE = 1  # USD
MIN_BUCKET = 300  # USD
MAX_BUCKET = 2000  # USD
SNAPSHOT_DELAY = 3.0  # seconds to wait for snapshot completion after subscription


def handle_subscribe_instrument(addon, alias, full_name, is_crypto, pips, size_multiplier, instrument_multiplier, supported_features):
    """
    Called when an instrument is subscribed in Bookmap.
    Only processes instruments containing 'BNBUSDT' or 'BNB-USDT' in their alias.

    Args:
        addon: The addon instance
        alias: Instrument alias/symbol
        full_name: Full instrument name
        is_crypto: Boolean indicating if this is a crypto instrument
        pips: Price tick size multiplier
        size_multiplier: Size/volume multiplier for converting to actual quantities
        instrument_multiplier: Additional instrument multiplier
        supported_features: Features supported by this instrument
    """
    # Filter: Only process BNB/USDT instruments (supports both BNBUSDT and BNB-USDT formats)
    alias_upper = alias.upper()
    is_bnb_usdt = "BNBUSDT" in alias_upper or "BNB-USDT" in alias_upper
    if not is_bnb_usdt:
        print(f"Skipping non-BNB/USDT instrument: {alias}", flush=True)
        return

    print(f"Subscribing to BNB Walls Snapshot for: {alias}", flush=True)

    # Initialize the instrument data structure with the default CSV path
    csv_path_default = "/Users/dannyalmaden/python_projects/python-api-bookmap/data/bnbusdt_snapshot.csv"

    instrument_data[alias] = {
        "full_name": full_name,
        "is_crypto": is_crypto,
        "pips": pips,
        "size_multiplier": size_multiplier,
        "instrument_multiplier": instrument_multiplier,
        "order_book": bm.create_order_book(),
        "csv_path": csv_path_default,
        "snapshot_received": False,
        "snapshot_timer": None
    }

    # Add a settings parameter for the CSV output path
    bm.add_string_settings_parameter(addon, alias, "CSV Output Path", csv_path_default)

    # Subscribe to depth data
    req_id = hash(alias) % 1000000  # Request ID derived from the alias
    bm.subscribe_to_depth(addon, alias, req_id)

    # Start a timer to export the snapshot after the depth subscription
    timer = threading.Timer(SNAPSHOT_DELAY, export_snapshot_to_csv, args=[addon, alias])
    timer.daemon = True
    timer.start()
    instrument_data[alias]["snapshot_timer"] = timer

    print(f"Successfully subscribed to {alias} - Snapshot will export in {SNAPSHOT_DELAY}s", flush=True)
    print(f"Snapshot will be exported to: {csv_path_default}", flush=True)


def handle_unsubscribe_instrument(addon, alias):
    """
    Called when an instrument is unsubscribed.
    Cleans up resources and cancels timers.

    Args:
        addon: The addon instance
        alias: Instrument alias being unsubscribed
    """
    print(f"Unsubscribing from: {alias}", flush=True)

    # Cancel the snapshot timer if one exists
    if alias in instrument_data and instrument_data[alias]["snapshot_timer"] is not None:
        instrument_data[alias]["snapshot_timer"].cancel()

    # Clean up data structures
    if alias in instrument_data:
        del instrument_data[alias]

    print(f"Cleanup complete for: {alias}", flush=True)


def on_depth(addon, alias, is_bid, price_level, size_level):
    """
    Handler for depth updates. Updates the order book with incoming depth data.
+ + Args: + addon: The addon instance + alias: Instrument alias + is_bid: True for bid/buy side, False for ask/sell side + price_level: Integer price level (must be multiplied by pips) + size_level: Integer size level (must be divided by size_multiplier) + """ + # Skip if instrument not tracked + if alias not in instrument_data: + return + + # Update internal order book + order_book = instrument_data[alias]["order_book"] + bm.on_depth(order_book, is_bid, price_level, size_level) + + + +def on_settings_change(addon, alias, setting_name, field_type, new_value): + """ + Handler for settings parameter changes. Updates CSV path when user changes it. + + Args: + addon: The addon instance + alias: Instrument alias + setting_name: Name of the setting that changed + field_type: Type of the field (string, int, etc.) + new_value: New value of the setting + """ + if setting_name == "CSV Output Path" and alias in instrument_data: + old_path = instrument_data[alias]["csv_path"] + instrument_data[alias]["csv_path"] = new_value + print(f"CSV output path updated for {alias}:", flush=True) + print(f" Old: {old_path}", flush=True) + print(f" New: {new_value}", flush=True) + + +def recalculate_liquidity_buckets(alias, order_book, pips, size_multiplier): + """ + Calculates aggregated liquidity for all price buckets by scanning + the entire order book. + + Each bucket only aggregates orders within a $1 range at exact integer price levels. 
    For example:
    - Bucket 800: includes orders from $800.00 to $800.99
    - Bucket 801: includes orders from $801.00 to $801.99
    - Orders at intermediate prices (e.g., $800.50) fall into the bucket of
      their integer part (here, bucket 800)

    Args:
        alias: Instrument alias
        order_book: Order book dictionary with 'bids' and 'asks' SortedDicts
        pips: Price multiplier for converting price levels to actual prices
        size_multiplier: Divisor for converting size levels to actual BNB quantities

    Returns:
        Dictionary mapping bucket prices to {'bid': float, 'ask': float}
    """
    # Initialize all buckets to zero
    liquidity_data = {}
    for bucket in range(MIN_BUCKET, MAX_BUCKET + 1, BUCKET_SIZE):
        liquidity_data[bucket] = {"bid": 0.0, "ask": 0.0}

    # Aggregate bid liquidity
    for price_level, size_level in order_book["bids"].items():
        actual_price = price_level * pips
        # Bucket is the integer part of the price, e.g. $800.50 -> bucket 800
        bucket = int(actual_price // BUCKET_SIZE) * BUCKET_SIZE

        if MIN_BUCKET <= bucket <= MAX_BUCKET and size_level > 0:
            bnb_qty = size_level / size_multiplier
            liquidity_data[bucket]["bid"] += bnb_qty

    # Aggregate ask liquidity
    for price_level, size_level in order_book["asks"].items():
        actual_price = price_level * pips
        # Bucket is the integer part of the price, e.g. $800.50 -> bucket 800
        bucket = int(actual_price // BUCKET_SIZE) * BUCKET_SIZE

        if MIN_BUCKET <= bucket <= MAX_BUCKET and size_level > 0:
            bnb_qty = size_level / size_multiplier
            liquidity_data[bucket]["ask"] += bnb_qty

    return liquidity_data


def export_snapshot_to_csv(addon, alias):
    """
    Exports the order book snapshot to
CSV file. Only exports once per subscription. + + Args: + addon: The addon instance + alias: Instrument alias + """ + if alias not in instrument_data: + print(f"No instrument data available for {alias}", flush=True) + return + + # Check if already exported + if instrument_data[alias]["snapshot_received"]: + print(f"Snapshot already exported for {alias}", flush=True) + return + + # Get instrument data + csv_path = instrument_data[alias]["csv_path"] + order_book = instrument_data[alias]["order_book"] + pips = instrument_data[alias]["pips"] + size_multiplier = instrument_data[alias]["size_multiplier"] + + # Create the data directory if it doesn't exist + csv_dir = os.path.dirname(csv_path) + if csv_dir and not os.path.exists(csv_dir): + try: + os.makedirs(csv_dir, exist_ok=True) + print(f"Created directory: {csv_dir}", flush=True) + except Exception as e: + print(f"ERROR: Failed to create output directory {csv_dir}: {e}", flush=True) + return + elif csv_dir: + print(f"Output directory: {csv_dir}", flush=True) + + # Check if order book has data + bid_count = len(order_book["bids"]) + ask_count = len(order_book["asks"]) + + print(f"Capturing snapshot for {alias}:", flush=True) + print(f" Order book depth: {bid_count} bids, {ask_count} asks", flush=True) + + if bid_count == 0 and ask_count == 0: + print(f"WARNING: Order book is empty for {alias}. 
Snapshot may not be complete.", flush=True) + + try: + # Calculate aggregated liquidity buckets + liquidity_data = recalculate_liquidity_buckets(alias, order_book, pips, size_multiplier) + + # Write to CSV + with open(csv_path, 'w', newline='') as csvfile: + writer = csv.writer(csvfile) + + # Write header + writer.writerow(['price', 'bid_bnb_qty', 'ask_bnb_qty']) + + # Write data for all buckets in order + for bucket in range(MIN_BUCKET, MAX_BUCKET + 1, BUCKET_SIZE): + bid_qty = liquidity_data.get(bucket, {}).get("bid", 0.0) + ask_qty = liquidity_data.get(bucket, {}).get("ask", 0.0) + + # Format quantities to 4 decimal places + writer.writerow([bucket, f"{bid_qty:.4f}", f"{ask_qty:.4f}"]) + + # Mark snapshot as received + instrument_data[alias]["snapshot_received"] = True + + print(f"✓ Snapshot successfully exported to: {csv_path}", flush=True) + print(f" Total buckets: {len(liquidity_data)}", flush=True) + + # Calculate total liquidity for summary + total_bid_bnb = sum(data["bid"] for data in liquidity_data.values()) + total_ask_bnb = sum(data["ask"] for data in liquidity_data.values()) + print(f" Total bid liquidity: {total_bid_bnb:.4f} BNB", flush=True) + print(f" Total ask liquidity: {total_ask_bnb:.4f} BNB", flush=True) + + except PermissionError as e: + print(f"ERROR: Permission denied writing to {csv_path}", flush=True) + print(f" Please check file permissions or change the CSV path in settings.", flush=True) + except OSError as e: + print(f"ERROR: Invalid file path {csv_path}", flush=True) + print(f" {str(e)}", flush=True) + print(f" Please update the CSV path in addon settings.", flush=True) + except Exception as e: + print(f"ERROR: Failed to export snapshot for {alias}: {str(e)}", flush=True) + + +if __name__ == "__main__": + print("=" * 70, flush=True) + print("BNB Walls Snapshot - Immediate Order Book Liquidity Exporter", flush=True) + print("=" * 70, flush=True) + print(f"Configuration:", flush=True) + print(f" Price range: {MIN_BUCKET:,} - 
{MAX_BUCKET:,} USD", flush=True) + print(f" Bucket increment: {BUCKET_SIZE} USD", flush=True) + print(f" Aggregation range: $1 per bucket (exact integer prices)", flush=True) + print(f" Total buckets: {(MAX_BUCKET - MIN_BUCKET) // BUCKET_SIZE + 1}", flush=True) + print(f" Snapshot delay: {SNAPSHOT_DELAY} seconds", flush=True) + print("", flush=True) + print("Behavior:", flush=True) + print(" • Only tracks instruments containing 'BNBUSDT' or 'BNB-USDT'", flush=True) + print(" • Captures ONE snapshot per subscription", flush=True) + print(" • Exports to CSV immediately after snapshot received", flush=True) + print(" • CSV path configurable via addon settings", flush=True) + print("=" * 70, flush=True) + + # Create addon instance + addon = bm.create_addon() + + # Register handlers + bm.add_depth_handler(addon, on_depth) + bm.add_on_setting_change_handler(addon, on_settings_change) + + # Start addon with subscribe/unsubscribe handlers + bm.start_addon(addon, handle_subscribe_instrument, handle_unsubscribe_instrument) + + print("BNB Walls Snapshot addon started successfully!", flush=True) + print("Waiting for BNBUSDT instruments to be subscribed...", flush=True) + print("", flush=True) + + # Wait until addon is turned off + bm.wait_until_addon_is_turned_off(addon) + + print("", flush=True) + print("BNB Walls Snapshot addon stopped.", flush=True) \ No newline at end of file diff --git a/examples/btc_walls.py b/examples/btc_walls.py new file mode 100644 index 0000000..0b2c046 --- /dev/null +++ b/examples/btc_walls.py @@ -0,0 +1,327 @@ +""" +BTC Walls - Order Book Liquidity Tracker + +This Bookmap addon tracks order book liquidity for BTCUSDT instruments and exports +aggregated data to CSV files every 10 minutes. 
+ +Features: +- Monitors bid/ask liquidity across 91 price buckets (30,000 to 120,000 USD) +- Each bucket represents a 1,000 USD range +- Aggregates total BTC quantities in each bucket +- Exports to CSV every 10 minutes with timestamp + +CSV Format: + price,bid_btc_qty,ask_btc_qty + 30000,155.7,12.3 + 31000,150.3,15.8 + ... + +Author: Bookmap Python API +""" + +import bookmap as bm +import csv +import threading +import os +from pathlib import Path +from datetime import datetime +from collections import defaultdict + +# Global data structures +alias_to_order_book = {} +alias_to_instrument = {} +alias_to_export_timer = {} +alias_to_liquidity_data = {} + +# Configuration constants +BUCKET_SIZE = 1000 # USD +MIN_BUCKET = 30000 # USD +MAX_BUCKET = 120000 # USD +EXPORT_INTERVAL = 600 # seconds (10 minutes) + +# Safe default output directory (user's home directory is always writable) +DEFAULT_OUTPUT_DIR = str(Path.home() / "btc_walls_exports") + + +def handle_subscribe_instrument(addon, alias, full_name, is_crypto, pips, size_multiplier, instrument_multiplier, supported_features): + """ + Called when an instrument is subscribed in Bookmap. + Only processes instruments containing 'BTCUSDT' in their alias. 
+ + Args: + addon: The addon instance + alias: Instrument alias/symbol + full_name: Full instrument name + is_crypto: Boolean indicating if this is a crypto instrument + pips: Price tick size multiplier + size_multiplier: Size/volume multiplier for converting to actual quantities + instrument_multiplier: Additional instrument multiplier + supported_features: Features supported by this instrument + """ + # Filter: Only process BTCUSDT instruments + if "BTCUSDT" not in alias: + print(f"Skipping non-BTCUSDT instrument: {alias}", flush=True) + return + + print(f"Subscribing to BTC Walls tracker for: {alias}", flush=True) + + # Store instrument metadata + instrument = { + "alias": alias, + "full_name": full_name, + "is_crypto": is_crypto, + "pips": pips, + "size_multiplier": size_multiplier, + "instrument_multiplier": instrument_multiplier, + "output_dir": DEFAULT_OUTPUT_DIR # Store output directory path + } + alias_to_instrument[alias] = instrument + + # Add settings parameter for configurable output directory + bm.add_string_settings_parameter(addon, alias, "Output Directory", DEFAULT_OUTPUT_DIR) + + # Create order book for tracking depth updates + alias_to_order_book[alias] = bm.create_order_book() + + # Initialize liquidity data structure for this instrument + # Structure: {bucket_price: {"bid": btc_qty, "ask": btc_qty}} + alias_to_liquidity_data[alias] = {} + + # Subscribe to depth data + req_id = hash(alias) % 1000000 # Generate unique request ID + bm.subscribe_to_depth(addon, alias, req_id) + + # Start periodic CSV export timer + start_export_timer(addon, alias) + + print(f"Successfully subscribed to {alias} - CSV export every {EXPORT_INTERVAL}s", flush=True) + print(f"Output directory: {DEFAULT_OUTPUT_DIR}", flush=True) + + +def handle_unsubscribe_instrument(addon, alias): + """ + Called when an instrument is unsubscribed. + Cleans up resources and cancels timers. 
+ + Args: + addon: The addon instance + alias: Instrument alias being unsubscribed + """ + print(f"Unsubscribing from: {alias}", flush=True) + + # Cancel export timer if exists + if alias in alias_to_export_timer: + alias_to_export_timer[alias].cancel() + del alias_to_export_timer[alias] + + # Clean up data structures + if alias in alias_to_order_book: + del alias_to_order_book[alias] + if alias in alias_to_instrument: + del alias_to_instrument[alias] + if alias in alias_to_liquidity_data: + del alias_to_liquidity_data[alias] + + print(f"Cleanup complete for: {alias}", flush=True) + + +def on_depth(addon, alias, is_bid, price_level, size_level): + """ + Handler for depth updates. Updates the order book and aggregates liquidity + into price buckets. + + Args: + addon: The addon instance + alias: Instrument alias + is_bid: True for bid/buy side, False for ask/sell side + price_level: Integer price level (must be multiplied by pips) + size_level: Integer size level (must be divided by size_multiplier) + """ + # Skip if instrument not tracked + if alias not in alias_to_order_book or alias not in alias_to_instrument: + return + + # Update internal order book + order_book = alias_to_order_book[alias] + bm.on_depth(order_book, is_bid, price_level, size_level) + + # Get instrument metadata for conversions + instrument = alias_to_instrument[alias] + pips = instrument["pips"] + size_multiplier = instrument["size_multiplier"] + + # Convert price_level to actual USD price + actual_price = price_level * pips + + # Determine which bucket this price falls into + bucket = int(actual_price // BUCKET_SIZE) * BUCKET_SIZE + + # Only track buckets within our range + if bucket < MIN_BUCKET or bucket > MAX_BUCKET: + return + + # Recalculate aggregated liquidity for all buckets + # This is necessary because depth updates can affect multiple levels + recalculate_liquidity_buckets(alias, order_book, pips, size_multiplier) + + +def recalculate_liquidity_buckets(alias, order_book, pips, 
size_multiplier): + """ + Recalculates aggregated liquidity for all price buckets by scanning + the entire order book. + + Args: + alias: Instrument alias + order_book: Order book dictionary with 'bids' and 'asks' SortedDicts + pips: Price multiplier for converting price levels to actual prices + size_multiplier: Divisor for converting size levels to actual BTC quantities + """ + # Initialize all buckets to zero + liquidity_data = {} + for bucket in range(MIN_BUCKET, MAX_BUCKET + 1, BUCKET_SIZE): + liquidity_data[bucket] = {"bid": 0.0, "ask": 0.0} + + # Aggregate bid liquidity + for price_level, size_level in order_book["bids"].items(): + actual_price = price_level * pips + bucket = int(actual_price // BUCKET_SIZE) * BUCKET_SIZE + + if MIN_BUCKET <= bucket <= MAX_BUCKET and size_level > 0: + btc_qty = size_level / size_multiplier + liquidity_data[bucket]["bid"] += btc_qty + + # Aggregate ask liquidity + for price_level, size_level in order_book["asks"].items(): + actual_price = price_level * pips + bucket = int(actual_price // BUCKET_SIZE) * BUCKET_SIZE + + if MIN_BUCKET <= bucket <= MAX_BUCKET and size_level > 0: + btc_qty = size_level / size_multiplier + liquidity_data[bucket]["ask"] += btc_qty + + # Update global liquidity data + alias_to_liquidity_data[alias] = liquidity_data + + +def export_to_csv(addon, alias): + """ + Exports aggregated liquidity data to a CSV file. + Creates a timestamped filename for each export. 
+ + Args: + addon: The addon instance + alias: Instrument alias + """ + if alias not in alias_to_liquidity_data or alias not in alias_to_instrument: + print(f"No liquidity data available for {alias}", flush=True) + return + + # Get output directory from instrument data + output_dir = alias_to_instrument[alias]["output_dir"] + + # Create directory if it doesn't exist + try: + os.makedirs(output_dir, exist_ok=True) + except Exception as e: + print(f"ERROR: Failed to create output directory {output_dir}: {e}", flush=True) + # Restart timer even on failure + start_export_timer(addon, alias) + return + + # Generate timestamped filename + timestamp = datetime.now().strftime("%Y%m%d_%H%M%S") + filename = f"btc_walls_{alias}_{timestamp}.csv" + + # Build full path to CSV file + csv_path = os.path.join(output_dir, filename) + + try: + liquidity_data = alias_to_liquidity_data[alias] + + with open(csv_path, 'w', newline='') as csvfile: + writer = csv.writer(csvfile) + + # Write header + writer.writerow(['price', 'bid_btc_qty', 'ask_btc_qty']) + + # Write data for all buckets in order + for bucket in range(MIN_BUCKET, MAX_BUCKET + 1, BUCKET_SIZE): + bid_qty = liquidity_data.get(bucket, {}).get("bid", 0.0) + ask_qty = liquidity_data.get(bucket, {}).get("ask", 0.0) + + # Format quantities to 4 decimal places + writer.writerow([bucket, f"{bid_qty:.4f}", f"{ask_qty:.4f}"]) + + print(f"✓ Exported liquidity data to: {csv_path}", flush=True) + + except Exception as e: + print(f"ERROR: Failed to export CSV for {alias}: {str(e)}", flush=True) + + # Restart timer for next export + start_export_timer(addon, alias) + + +def start_export_timer(addon, alias): + """ + Starts a periodic timer to export CSV data every EXPORT_INTERVAL seconds. 
+ + Args: + addon: The addon instance + alias: Instrument alias + """ + # Cancel existing timer if any + if alias in alias_to_export_timer: + alias_to_export_timer[alias].cancel() + + # Create new timer + timer = threading.Timer(EXPORT_INTERVAL, export_to_csv, args=[addon, alias]) + timer.daemon = True # Daemon thread will exit when main program exits + timer.start() + + alias_to_export_timer[alias] = timer + + +def on_settings_change(addon, alias, setting_name, field_type, new_value): + """ + Handler for settings parameter changes. Updates output directory when user changes it. + + Args: + addon: The addon instance + alias: Instrument alias + setting_name: Name of the setting that changed + field_type: Type of the field (string, int, etc.) + new_value: New value of the setting + """ + if setting_name == "Output Directory" and alias in alias_to_instrument: + old_dir = alias_to_instrument[alias]["output_dir"] + alias_to_instrument[alias]["output_dir"] = new_value + print(f"Output directory updated for {alias}:", flush=True) + print(f" Old: {old_dir}", flush=True) + print(f" New: {new_value}", flush=True) + + +if __name__ == "__main__": + print("Starting BTC Walls addon...", flush=True) + print(f"Configuration:", flush=True) + print(f" Price range: {MIN_BUCKET:,} - {MAX_BUCKET:,} USD", flush=True) + print(f" Bucket size: {BUCKET_SIZE:,} USD", flush=True) + print(f" Export interval: {EXPORT_INTERVAL} seconds", flush=True) + print(f" Total buckets: {(MAX_BUCKET - MIN_BUCKET) // BUCKET_SIZE + 1}", flush=True) + + # Create addon instance + addon = bm.create_addon() + + # Register handlers + bm.add_depth_handler(addon, on_depth) + bm.add_on_settings_change_handler(addon, on_settings_change) + + # Start addon with subscribe/unsubscribe handlers + bm.start_addon(addon, handle_subscribe_instrument, handle_unsubscribe_instrument) + + print("BTC Walls addon started successfully!", flush=True) + print("Waiting for BTCUSDT instruments to be subscribed...", flush=True) + + # Wait 
until addon is turned off + bm.wait_until_addon_is_turned_off(addon) + + print("BTC Walls addon stopped.", flush=True) \ No newline at end of file diff --git a/examples/btc_walls_readme.md b/examples/btc_walls_readme.md new file mode 100644 index 0000000..b502b85 --- /dev/null +++ b/examples/btc_walls_readme.md @@ -0,0 +1,431 @@ +# BTC Walls - Order Book Liquidity Tracker + +A Bookmap Python API addon that monitors BTCUSDT order book depth in real-time and exports aggregated liquidity data to CSV files every 10 minutes. This tool is designed for traders and analysts who need to track and analyze order book liquidity patterns over time. + +## Overview + +BTC Walls continuously tracks bid and ask liquidity across the BTCUSDT order book, aggregating order sizes into 1,000 USD price buckets spanning from $30,000 to $120,000. The aggregated data is automatically exported to timestamped CSV files at regular intervals, providing a historical record of liquidity distribution. + +**Primary Use Case:** Market depth analysis and liquidity tracking for cryptocurrency trading strategies, particularly for identifying support/resistance levels and "whale walls" in the Bitcoin market. + +## Features + +- **Real-time Order Book Monitoring**: Continuously tracks BTCUSDT depth data with automatic updates +- **Price Bucket Aggregation**: Organizes liquidity into 91 discrete 1,000 USD buckets for simplified analysis +- **Automated CSV Export**: Saves snapshots of liquidity data every 10 minutes with timestamps +- **Instrument Filtering**: Automatically filters to only process BTCUSDT instruments +- **Resource Management**: Proper cleanup when instruments are unsubscribed +- **Multi-instrument Support**: Can track multiple BTCUSDT instruments simultaneously (e.g., different exchanges) + +## How It Works + +### Algorithm Overview + +1. **Subscription**: When a BTCUSDT instrument is subscribed in Bookmap, the addon initializes tracking +2. 
**Depth Updates**: Each order book update triggers recalculation of liquidity buckets +3. **Aggregation**: Orders are grouped into 1,000 USD price buckets (e.g., 50,000-50,999 USD) +4. **Periodic Export**: Every 10 minutes, the current state is exported to a CSV file +5. **Continuous Operation**: The cycle repeats until the instrument is unsubscribed or addon is stopped + +### Price Bucket Aggregation Logic + +The addon divides the order book into fixed-size buckets: + +```python +bucket_price = int(actual_price // 1000) * 1000 +``` + +For example: +- Orders at $50,234 → Bucket 50,000 +- Orders at $50,789 → Bucket 50,000 +- Orders at $51,050 → Bucket 51,000 + +Within each bucket, BTC quantities are summed separately for bids and asks. + +### Data Collection Process + +1. **Order Book Updates** arrive via the [`on_depth()`](examples/btc_walls.py:118) handler +2. **Price Conversion**: `price_level × pips` converts internal price levels to USD +3. **Size Conversion**: `size_level ÷ size_multiplier` converts to actual BTC quantities +4. **Bucket Assignment**: Each order is assigned to its corresponding price bucket +5. **Aggregation**: The entire order book is scanned and liquidity is summed by bucket +6. **Export**: At each 10-minute interval, the current state is written to CSV + +## Installation & Setup + +### Prerequisites + +- **Bookmap** 7.4 or later +- **Bookmap Python API Developer Addon** installed and configured +- **Python** 3.8 or higher + +### Adding the Script to Bookmap + +1. Open Bookmap and ensure the Python API Developer Addon is installed +2. Open the Python API Developer interface (usually in the top menu) +3. Create a new Python script or import [`btc_walls.py`](examples/btc_walls.py:1) +4. The script is now available to run as a Bookmap addon + +## Usage Instructions + +### Running the Addon + +1. 
**Start the Addon** + - In the Bookmap Python API Developer interface, locate [`btc_walls.py`](examples/btc_walls.py:1) + - Click the "Run" or "Start" button + - Console output will confirm startup: + ``` + Starting BTC Walls addon... + Configuration: + Price range: 30,000 - 120,000 USD + Bucket size: 1,000 USD + Export interval: 600 seconds + Total buckets: 91 + BTC Walls addon started successfully! + Waiting for BTCUSDT instruments to be subscribed... + ``` + +2. **Subscribe to BTCUSDT Instruments** + - In Bookmap, subscribe to any BTCUSDT instrument (e.g., Binance BTCUSDT) + - The addon will automatically detect and start tracking: + ``` + Subscribing to BTC Walls tracker for: BTCUSDT + Successfully subscribed to BTCUSDT - CSV export every 600s + ``` + +3. **Monitor Operation** + - The addon runs continuously in the background + - Every 10 minutes, you'll see export confirmations: + ``` + ✓ Exported liquidity data to: btc_walls_BTCUSDT_20250125_084530.csv + ``` + +4. **Locate CSV Files** + - CSV files are saved in the addon's output directory, which can be changed at runtime via the "Output Directory" setting + - Typically located in the Bookmap installation folder or user data directory + - Files are named: `btc_walls_{alias}_{timestamp}.csv` + +5. **Stop the Addon** + - Click "Stop" in the Python API Developer interface + - Or unsubscribe from all BTCUSDT instruments + - Cleanup confirmation will appear: + ``` + Unsubscribing from: BTCUSDT + Cleanup complete for: BTCUSDT + BTC Walls addon stopped.
+ ``` + +## CSV Output Format + +### File Structure + +Each CSV file contains three columns with 91 rows (one per price bucket): + +| Column | Description | Data Type | Example | +|--------|-------------|-----------|---------| +| `price` | Bucket price level (USD) | Integer | 50000 | +| `bid_btc_qty` | Total BTC quantity on bid side in this bucket | Float (4 decimals) | 125.3456 | +| `ask_btc_qty` | Total BTC quantity on ask side in this bucket | Float (4 decimals) | 87.2341 | + +### Column Definitions + +- **price**: The lower bound of the 1,000 USD price bucket (e.g., 50000 represents the range 50,000-50,999) +- **bid_btc_qty**: Sum of all BTC offered to buy (bid orders) within this price bucket +- **ask_btc_qty**: Sum of all BTC offered to sell (ask orders) within this price bucket + +### Example Output + +```csv +price,bid_btc_qty,ask_btc_qty +30000,0.0000,0.0000 +31000,0.0000,0.0000 +32000,0.0000,0.0000 +... +49000,12.3456,0.0000 +50000,125.4789,0.0000 +51000,98.2341,15.6789 +52000,0.0000,45.2314 +53000,0.0000,89.4532 +... 
+120000,0.0000,0.0000 +``` + +In this example: +- At $50,000 bucket: 125.48 BTC in bids, no asks +- At $51,000 bucket: 98.23 BTC in bids, 15.68 BTC in asks +- Empty buckets (0.0000) indicate no orders in that price range + +### File Naming Convention + +Format: `btc_walls_{alias}_{timestamp}.csv` + +- **alias**: The instrument identifier (e.g., "BTCUSDT") +- **timestamp**: Format `YYYYMMDD_HHMMSS` (e.g., "20250125_143027") + +Example: `btc_walls_BTCUSDT_20250125_143027.csv` + +## Configuration + +The addon uses the following default configuration constants defined in [`btc_walls.py`](examples/btc_walls.py:34): + +| Parameter | Default Value | Description | +|-----------|---------------|-------------| +| `MIN_BUCKET` | 30,000 USD | Lowest price bucket tracked | +| `MAX_BUCKET` | 120,000 USD | Highest price bucket tracked | +| `BUCKET_SIZE` | 1,000 USD | Width of each price bucket | +| `EXPORT_INTERVAL` | 600 seconds | Time between CSV exports (10 minutes) | + +### Customizing Configuration + +To modify these parameters, edit the constants at the top of the script: + +```python +# Configuration constants +BUCKET_SIZE = 1000 # Change to 500 for finer granularity +MIN_BUCKET = 30000 # Adjust range as needed +MAX_BUCKET = 120000 # Adjust range as needed +EXPORT_INTERVAL = 600 # Change to 300 for 5-minute exports +``` + +**Considerations when modifying:** + +- **Smaller `BUCKET_SIZE`**: More granular data but larger CSV files and more computation +- **Wider price range**: Covers more extreme price movements but increases file size +- **Shorter `EXPORT_INTERVAL`**: More frequent snapshots but more disk I/O and files generated + +## Technical Details + +### Order Book Tracking Mechanism + +The addon uses Bookmap's order book utilities to maintain an internal representation: + +1. **Order Book Creation**: [`bm.create_order_book()`](examples/btc_walls.py:75) initializes a SortedDict structure +2. 
**Depth Subscription**: [`bm.subscribe_to_depth()`](examples/btc_walls.py:83) registers for real-time updates +3. **Update Handler**: [`bm.on_depth()`](examples/btc_walls.py:136) processes each depth change +4. **Full Recalculation**: After each update, [`recalculate_liquidity_buckets()`](examples/btc_walls.py:158) scans the entire order book + +### Price and Size Conversion + +Bookmap provides depth data as integer levels that require conversion: + +**Price Conversion:** +```python +actual_price = price_level * pips +``` +- `price_level`: Integer representation of price +- `pips`: Multiplier to convert to actual USD price + +**Size Conversion:** +```python +btc_qty = size_level / size_multiplier +``` +- `size_level`: Integer representation of order size +- `size_multiplier`: Divisor to convert to actual BTC quantity + +### Threading Model + +The addon uses Python's `threading.Timer` for periodic exports: + +- **Timer Creation**: A daemon thread is created with a 600-second delay +- **Execution**: After the delay, [`export_to_csv()`](examples/btc_walls.py:196) is called +- **Restart**: After each export, a new timer is scheduled +- **Cleanup**: Timers are cancelled during unsubscribe to prevent memory leaks + +### Resource Cleanup + +Proper cleanup is handled in [`handle_unsubscribe_instrument()`](examples/btc_walls.py:91): + +1. Cancel active export timer +2. Remove order book data +3. Clear instrument metadata +4. Delete liquidity aggregation data + +## Use Cases + +### Market Depth Analysis +Track how liquidity distribution changes over time to identify accumulation or distribution patterns. + +### Liquidity Tracking for Trading Strategies +Monitor "walls" (large orders) that may act as support or resistance levels for algorithmic trading decisions. + +### Historical Liquidity Research +Build a historical dataset of order book snapshots for backtesting and market microstructure research. 
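Because the exports are plain CSV in the documented three-column format, downstream research needs nothing beyond the standard library. Here is a minimal post-processing sketch, separate from the addon itself; the 50 BTC "wall" threshold and the inline sample rows are arbitrary illustrations, not values from the addon:

```python
import csv
import io

def find_walls(csv_text, min_btc=50.0):
    """Return (price, side, qty) tuples for buckets whose liquidity exceeds min_btc."""
    walls = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        price = int(row["price"])
        for side in ("bid", "ask"):
            qty = float(row[f"{side}_btc_qty"])
            if qty >= min_btc:
                walls.append((price, side, qty))
    return walls

# Sample rows in the btc_walls CSV layout: price,bid_btc_qty,ask_btc_qty
sample = """price,bid_btc_qty,ask_btc_qty
50000,125.3456,0.0000
51000,98.2341,15.6789
54000,8.3421,134.2156
"""

for price, side, qty in find_walls(sample):
    print(f"{side} wall of {qty:.4f} BTC at ${price:,}")
```

In practice the same loop would run over `open(path)` for each timestamped file, letting you track how walls appear, grow, or migrate between 10-minute snapshots.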
+ +### Support/Resistance Level Identification +Identify price levels with significant liquidity concentrations that may serve as key technical levels. + +### Whale Detection +Track large orders ("whale walls") that could indicate institutional positioning or market manipulation attempts. + +### Market Making Analysis +Understand bid-ask spread dynamics and liquidity provision patterns across different price levels. + +## Limitations & Considerations + +### Instrument Restriction +- **Only BTCUSDT**: The addon filters to only process instruments containing "BTCUSDT" in the alias +- Other cryptocurrencies or instruments are automatically ignored +- To track other instruments, modify the filter in [`handle_subscribe_instrument()`](examples/btc_walls.py:57) + +### Data Feed Requirements +- **Active Depth Feed**: Requires an active market data subscription with order book depth +- No data will be collected if the depth feed is disconnected or unavailable +- Ensure your Bookmap connection supports full depth data (not just top-of-book) + +### File Accumulation +- **CSV Storage**: Files accumulate over time (one every 10 minutes = 144 files per day) +- Monitor disk space if running continuously for extended periods +- Consider implementing file rotation or archival strategies for long-term operation + +### Memory Usage +- **Continuous Operation**: The order book is kept in memory and continuously updated +- Memory usage scales with order book depth +- For instruments with very deep order books, monitor memory consumption + +### Price Range Limitation +- Only tracks prices between **$30,000 and $120,000** +- Orders outside this range are ignored +- Adjust `MIN_BUCKET` and `MAX_BUCKET` if BTC price moves significantly + +## Troubleshooting + +### Problem: No CSV Files Generated + +**Possible Causes:** +1. No BTCUSDT instrument subscribed +2. Addon not running +3. 
Insufficient permissions to write files + +**Solutions:** +- Verify in console: "Successfully subscribed to BTCUSDT - CSV export every 600s" +- Check addon is running (should see startup messages) +- Verify Bookmap has write permissions in its working directory +- Wait at least 10 minutes for the first export + +### Problem: CSV Files Are Empty or Have All Zeros + +**Possible Causes:** +1. No orders in the tracked price range (30,000-120,000) +2. Depth data feed not active +3. Order book not populating + +**Solutions:** +- This is **normal** if BTC price is outside the tracked range +- Verify depth data is flowing: check Bookmap's DOM (Depth of Market) panel +- Adjust `MIN_BUCKET` and `MAX_BUCKET` to match current price levels +- Check that your data subscription includes order book depth (not just trades) + +### Problem: Missing Buckets in Data + +**Expected Behavior:** +- Empty buckets (0.0000 quantities) are normal +- Not all price levels have orders +- Liquidity concentrates at certain levels + +**Not a Problem:** +- CSV will always contain all 91 buckets even if most are zero +- This allows for consistent data processing downstream + +### Problem: Addon Not Tracking After Subscribe + +**Verification Steps:** +1. Check console logs for "Subscribing to BTC Walls tracker for: {alias}" +2. Verify instrument alias contains "BTCUSDT" +3. Confirm "Successfully subscribed" message appears + +**Solutions:** +- If "Skipping non-BTCUSDT instrument" appears, you subscribed to the wrong instrument +- Subscribe to an instrument with "BTCUSDT" in the name +- Restart the addon if subscription occurred before addon started + +### Problem: Timer Not Firing / No Periodic Exports + +**Debugging:** +- Check for export messages every 10 minutes: "Exported liquidity data to: ..." 
+- Verify `EXPORT_INTERVAL` is set correctly (default 600 seconds) +- Check for error messages during export + +**Solutions:** +- If no errors but no exports, check threading (should use `threading.Timer`) +- Verify daemon threads are allowed in Bookmap's Python environment +- Check file system for any write errors in console + +## Example Output + +Here's a realistic example of CSV output during active market conditions: + +```csv +price,bid_btc_qty,ask_btc_qty +30000,0.0000,0.0000 +31000,0.0000,0.0000 +... +48000,2.3450,0.0000 +49000,15.6780,0.0000 +50000,125.3456,0.0000 +51000,98.2341,15.6789 +52000,45.1234,45.2314 +53000,12.4567,89.4532 +54000,8.3421,134.2156 +55000,3.2109,98.7654 +56000,0.0000,67.4321 +... +120000,0.0000,0.0000 +``` + +### Interpretation + +In this snapshot (assume current BTC price is ~$52,000): + +- **Strong bid support** at $50,000 (125.34 BTC) and $51,000 (98.23 BTC) +- **Balanced liquidity** around $52,000 (45.12 BTC bids, 45.23 BTC asks) +- **Ask wall** at $54,000 (134.22 BTC) - potential resistance +- **Thin liquidity** below $48,000 and above $56,000 +- **Empty buckets** far from current price (expected) + +### Trading Insights from This Data + +- The $50,000 level shows significant buy interest (likely psychological support) +- The $54,000 ask wall may act as resistance if price rises +- Limited downside liquidity suggests potential for faster drops if $49,000 breaks +- Monitoring changes in these levels over time reveals market sentiment shifts + +## Code Reference + +### Implementation File +- Full source code: [`examples/btc_walls.py`](examples/btc_walls.py:1) +- 282 lines of documented Python code + +### Key Functions + +| Function | Line | Purpose | +|----------|------|---------| +| [`handle_subscribe_instrument()`](examples/btc_walls.py:41) | 41 | Initializes tracking for newly subscribed BTCUSDT instruments | +| [`handle_unsubscribe_instrument()`](examples/btc_walls.py:91) | 91 | Cleans up resources when instrument is 
unsubscribed | +| [`on_depth()`](examples/btc_walls.py:118) | 118 | Processes incoming depth updates | +| [`recalculate_liquidity_buckets()`](examples/btc_walls.py:158) | 158 | Aggregates order book into price buckets | +| [`export_to_csv()`](examples/btc_walls.py:196) | 196 | Writes liquidity data to timestamped CSV | +| [`start_export_timer()`](examples/btc_walls.py:239) | 239 | Schedules periodic CSV exports | + +### Bookmap Python API Documentation + +For more information on the Bookmap Python API: +- Main README: [`README.md`](README.md:1) +- Quick Guide: [`QUICK_GUIDE.md`](QUICK_GUIDE.md:1) +- Other Examples: [`examples/`](examples/) + +### Related API Functions Used + +- `bm.create_addon()`: Creates addon instance +- `bm.create_order_book()`: Initializes order book data structure +- `bm.subscribe_to_depth()`: Subscribes to order book depth updates +- `bm.on_depth()`: Updates internal order book representation +- `bm.add_depth_handler()`: Registers depth update callback +- `bm.start_addon()`: Starts addon with subscribe/unsubscribe handlers +- `bm.wait_until_addon_is_turned_off()`: Keeps addon running until stopped + +--- + +**Version:** 1.0 +**Last Updated:** 2025-01-25 +**Author:** Bookmap Python API Team +**License:** See project LICENSE file \ No newline at end of file diff --git a/examples/btc_walls_snapshot.py b/examples/btc_walls_snapshot.py new file mode 100644 index 0000000..fd8a075 --- /dev/null +++ b/examples/btc_walls_snapshot.py @@ -0,0 +1,345 @@ +""" +BTC Walls Snapshot - Immediate Order Book Liquidity Snapshot Exporter + +This Bookmap addon captures a single snapshot of BTC/USDT order book liquidity (supports BTCUSDT and BTC-USDT formats) +immediately after subscription and exports to a user-configured CSV file. 
+ +Features: +- One-time snapshot capture (no periodic exports) +- Monitors bid/ask liquidity across 171 price buckets (30,000 to 200,000 USD) +- Each bucket represents a $1 range at exact integer price levels +- Configurable CSV output path via settings UI +- Exports immediately after order book snapshot is received +- Default export directory: /Users/dannyalmaden/python_projects/python-api-bookmap/data +- Default file format: btcusdt_snapshot.csv + +Bucket Behavior: +- Bucket 30000: Aggregates orders from $30,000.00 to $30,000.99 +- Bucket 31000: Aggregates orders from $31,000.00 to $31,000.99 +- Bucket 90000: Aggregates orders from $90,000.00 to $90,000.99 +- Orders at intermediate prices (e.g., $90,500) are NOT included in any bucket + +CSV Format: + price,bid_btc_qty,ask_btc_qty + 30000,155.7,12.3 + 31000,150.3,15.8 + ... + +Author: Bookmap Python API +""" + +import bookmap as bm +import csv +import threading +import time +import os +from pathlib import Path +from collections import defaultdict +from datetime import datetime + +# Global data structures +instrument_data = {} + +# Configuration constants +BUCKET_SIZE = 1000 # USD +MIN_BUCKET = 30000 # USD +MAX_BUCKET = 200000 # USD +SNAPSHOT_DELAY = 3.0 # seconds to wait for snapshot completion after subscription + + +def handle_subscribe_instrument(addon, alias, full_name, is_crypto, pips, size_multiplier, instrument_multiplier, supported_features): + """ + Called when an instrument is subscribed in Bookmap. + Only processes instruments containing 'BTCUSDT' or 'BTC-USDT' in their alias. 
+ + Args: + addon: The addon instance + alias: Instrument alias/symbol + full_name: Full instrument name + is_crypto: Boolean indicating if this is a crypto instrument + pips: Price tick size multiplier + size_multiplier: Size/volume multiplier for converting to actual quantities + instrument_multiplier: Additional instrument multiplier + supported_features: Features supported by this instrument + """ + # Filter: Only process BTC/USDT instruments (supports both BTCUSDT and BTC-USDT formats) + is_btc_usdt = "BTCUSDT" in alias or "BTC-USDT" in alias + if not is_btc_usdt: + print(f"Skipping non-BTC/USDT instrument: {alias}", flush=True) + return + + print(f"Subscribing to BTC Walls Snapshot for: {alias}", flush=True) + + # Initialize instrument data structure with the default CSV path + csv_path_default = "/Users/dannyalmaden/python_projects/python-api-bookmap/data/btcusdt_snapshot.csv" + + instrument_data[alias] = { + "full_name": full_name, + "is_crypto": is_crypto, + "pips": pips, + "size_multiplier": size_multiplier, + "instrument_multiplier": instrument_multiplier, + "order_book": bm.create_order_book(), + "csv_path": csv_path_default, + "snapshot_received": False, + "snapshot_timer": None + } + + # Add settings parameter for CSV output path + bm.add_string_settings_parameter(addon, alias, "CSV Output Path", csv_path_default) + + # Subscribe to depth data + req_id = hash(alias) % 1000000 # Generate unique request ID + bm.subscribe_to_depth(addon, alias, req_id) + + # Start timer to export snapshot after depth subscription + timer = threading.Timer(SNAPSHOT_DELAY, export_snapshot_to_csv, args=[addon, alias]) + timer.daemon = True + timer.start() + instrument_data[alias]["snapshot_timer"] = timer + + print(f"Successfully subscribed to {alias} - Snapshot will export in {SNAPSHOT_DELAY}s", flush=True) + print(f"Snapshot will be exported to: {csv_path_default}", flush=True) + + +def handle_unsubscribe_instrument(addon, alias): + """ + Called when an instrument is
unsubscribed. + Cleans up resources and cancels timers. + + Args: + addon: The addon instance + alias: Instrument alias being unsubscribed + """ + print(f"Unsubscribing from: {alias}", flush=True) + + # Cancel snapshot timer if exists + if alias in instrument_data and instrument_data[alias]["snapshot_timer"] is not None: + instrument_data[alias]["snapshot_timer"].cancel() + + # Clean up data structures + if alias in instrument_data: + del instrument_data[alias] + + + print(f"Cleanup complete for: {alias}", flush=True) + + +def on_depth(addon, alias, is_bid, price_level, size_level): + """ + Handler for depth updates. Updates the order book with incoming depth data. + + Args: + addon: The addon instance + alias: Instrument alias + is_bid: True for bid/buy side, False for ask/sell side + price_level: Integer price level (must be multiplied by pips) + size_level: Integer size level (must be divided by size_multiplier) + """ + # Skip if instrument not tracked + if alias not in instrument_data: + return + + # Update internal order book + order_book = instrument_data[alias]["order_book"] + bm.on_depth(order_book, is_bid, price_level, size_level) + + + +def on_settings_change(addon, alias, setting_name, field_type, new_value): + """ + Handler for settings parameter changes. Updates CSV path when user changes it. + + Args: + addon: The addon instance + alias: Instrument alias + setting_name: Name of the setting that changed + field_type: Type of the field (string, int, etc.) 
+ new_value: New value of the setting + """ + if setting_name == "CSV Output Path" and alias in instrument_data: + old_path = instrument_data[alias]["csv_path"] + instrument_data[alias]["csv_path"] = new_value + print(f"CSV output path updated for {alias}:", flush=True) + print(f" Old: {old_path}", flush=True) + print(f" New: {new_value}", flush=True) + + +def recalculate_liquidity_buckets(alias, order_book, pips, size_multiplier): + """ + Calculates aggregated liquidity for all price buckets by scanning + the entire order book. + + Each bucket only aggregates orders within a $1 range at exact integer price levels. + For example: + - Bucket 90000: Only includes orders from $90,000.00 to $90,000.99 + - Bucket 91000: Only includes orders from $91,000.00 to $91,000.99 + - Orders at intermediate prices (e.g., $90,500) are excluded from all buckets + + Args: + alias: Instrument alias + order_book: Order book dictionary with 'bids' and 'asks' SortedDicts + pips: Price multiplier for converting price levels to actual prices + size_multiplier: Divisor for converting size levels to actual BTC quantities + + Returns: + Dictionary mapping bucket prices to {'bid': float, 'ask': float} + """ + # Initialize all buckets to zero + liquidity_data = {} + for bucket in range(MIN_BUCKET, MAX_BUCKET + 1, BUCKET_SIZE): + liquidity_data[bucket] = {"bid": 0.0, "ask": 0.0} + + # Aggregate bid liquidity + for price_level, size_level in order_book["bids"].items(): + actual_price = price_level * pips + bucket = int(actual_price // BUCKET_SIZE) * BUCKET_SIZE + + # Only include orders where the integer price matches the bucket exactly + # e.g., for bucket 90000, only prices from 90000.00 to 90000.99 + if MIN_BUCKET <= bucket <= MAX_BUCKET and size_level > 0: + # Check if price is within $1 of the bucket (bucket to bucket+0.99) + if bucket <= actual_price < bucket + 1: + btc_qty = size_level / size_multiplier + liquidity_data[bucket]["bid"] += btc_qty + + # Aggregate ask liquidity + for 
price_level, size_level in order_book["asks"].items(): + actual_price = price_level * pips + bucket = int(actual_price // BUCKET_SIZE) * BUCKET_SIZE + + # Only include orders where the integer price matches the bucket exactly + # e.g., for bucket 90000, only prices from 90000.00 to 90000.99 + if MIN_BUCKET <= bucket <= MAX_BUCKET and size_level > 0: + # Check if price is within $1 of the bucket (bucket to bucket+0.99) + if bucket <= actual_price < bucket + 1: + btc_qty = size_level / size_multiplier + liquidity_data[bucket]["ask"] += btc_qty + + return liquidity_data + + +def export_snapshot_to_csv(addon, alias): + """ + Exports order book snapshot to CSV file. Only exports once per subscription. + + Args: + addon: The addon instance + alias: Instrument alias + """ + if alias not in instrument_data: + print(f"No instrument data available for {alias}", flush=True) + return + + # Check if already exported + if instrument_data[alias]["snapshot_received"]: + print(f"Snapshot already exported for {alias}", flush=True) + return + + # Get instrument data + csv_path = instrument_data[alias]["csv_path"] + order_book = instrument_data[alias]["order_book"] + pips = instrument_data[alias]["pips"] + size_multiplier = instrument_data[alias]["size_multiplier"] + + # Create the data directory if it doesn't exist + csv_dir = os.path.dirname(csv_path) + if csv_dir and not os.path.exists(csv_dir): + try: + os.makedirs(csv_dir, exist_ok=True) + print(f"Created directory: {csv_dir}", flush=True) + except Exception as e: + print(f"ERROR: Failed to create output directory {csv_dir}: {e}", flush=True) + return + elif csv_dir: + print(f"Output directory: {csv_dir}", flush=True) + + # Check if order book has data + bid_count = len(order_book["bids"]) + ask_count = len(order_book["asks"]) + + print(f"Capturing snapshot for {alias}:", flush=True) + print(f" Order book depth: {bid_count} bids, {ask_count} asks", flush=True) + + if bid_count == 0 and ask_count == 0: + print(f"WARNING: Order 
book is empty for {alias}. Snapshot may not be complete.", flush=True) + + try: + # Calculate aggregated liquidity buckets + liquidity_data = recalculate_liquidity_buckets(alias, order_book, pips, size_multiplier) + + # Write to CSV + with open(csv_path, 'w', newline='') as csvfile: + writer = csv.writer(csvfile) + + # Write header + writer.writerow(['price', 'bid_btc_qty', 'ask_btc_qty']) + + # Write data for all buckets in order + for bucket in range(MIN_BUCKET, MAX_BUCKET + 1, BUCKET_SIZE): + bid_qty = liquidity_data.get(bucket, {}).get("bid", 0.0) + ask_qty = liquidity_data.get(bucket, {}).get("ask", 0.0) + + # Format quantities to 4 decimal places + writer.writerow([bucket, f"{bid_qty:.4f}", f"{ask_qty:.4f}"]) + + # Mark snapshot as received + instrument_data[alias]["snapshot_received"] = True + + print(f"✓ Snapshot successfully exported to: {csv_path}", flush=True) + print(f" Total buckets: {len(liquidity_data)}", flush=True) + + # Calculate total liquidity for summary + total_bid_btc = sum(data["bid"] for data in liquidity_data.values()) + total_ask_btc = sum(data["ask"] for data in liquidity_data.values()) + print(f" Total bid liquidity: {total_bid_btc:.4f} BTC", flush=True) + print(f" Total ask liquidity: {total_ask_btc:.4f} BTC", flush=True) + + except PermissionError as e: + print(f"ERROR: Permission denied writing to {csv_path}", flush=True) + print(f" Please check file permissions or change the CSV path in settings.", flush=True) + except OSError as e: + print(f"ERROR: Invalid file path {csv_path}", flush=True) + print(f" {str(e)}", flush=True) + print(f" Please update the CSV path in addon settings.", flush=True) + except Exception as e: + print(f"ERROR: Failed to export snapshot for {alias}: {str(e)}", flush=True) + + +if __name__ == "__main__": + print("=" * 70, flush=True) + print("BTC Walls Snapshot - Immediate Order Book Liquidity Exporter", flush=True) + print("=" * 70, flush=True) + print(f"Configuration:", flush=True) + print(f" Price range: 
{MIN_BUCKET:,} - {MAX_BUCKET:,} USD", flush=True) + print(f" Bucket interval: {BUCKET_SIZE:,} USD (labels only)", flush=True) + print(f" Aggregation range: $1 per bucket (exact integer prices)", flush=True) + print(f" Total buckets: {(MAX_BUCKET - MIN_BUCKET) // BUCKET_SIZE + 1}", flush=True) + print(f" Snapshot delay: {SNAPSHOT_DELAY} seconds", flush=True) + print("", flush=True) + print("Behavior:", flush=True) + print(" • Only tracks instruments containing 'BTCUSDT' or 'BTC-USDT'", flush=True) + print(" • Captures ONE snapshot per subscription", flush=True) + print(" • Exports to CSV immediately after snapshot received", flush=True) + print(" • CSV path configurable via addon settings", flush=True) + print("=" * 70, flush=True) + + # Create addon instance + addon = bm.create_addon() + + # Register handlers + bm.add_depth_handler(addon, on_depth) + bm.add_on_setting_change_handler(addon, on_settings_change) + + # Start addon with subscribe/unsubscribe handlers + bm.start_addon(addon, handle_subscribe_instrument, handle_unsubscribe_instrument) + + print("BTC Walls Snapshot addon started successfully!", flush=True) + print("Waiting for BTCUSDT instruments to be subscribed...", flush=True) + print("", flush=True) + + # Wait until addon is turned off + bm.wait_until_addon_is_turned_off(addon) + + print("", flush=True) + print("BTC Walls Snapshot addon stopped.", flush=True) \ No newline at end of file diff --git a/examples/eth_walls_snapshot.py b/examples/eth_walls_snapshot.py new file mode 100644 index 0000000..6ebce89 --- /dev/null +++ b/examples/eth_walls_snapshot.py @@ -0,0 +1,347 @@ +""" +ETH Walls Snapshot - Immediate Order Book Liquidity Snapshot Exporter + +This Bookmap addon captures a single snapshot of ETH/USDT order book liquidity (supports both ETHUSDT and ETH-USDT formats) +immediately after subscription and exports to a user-configured CSV file. 
+ +Features: +- One-time snapshot capture (no periodic exports) +- Monitors bid/ask liquidity across 501 price buckets (1,000 to 6,000 USD) +- Each bucket represents a $1 range at exact integer price levels +- Configurable CSV output path via settings UI +- Exports immediately after order book snapshot is received +- Default export directory: /Users/dannyalmaden/python_projects/python-api-bookmap/data +- Default file format: ethusdt_snapshot.csv + +Bucket Behavior: +- Bucket 1000: Aggregates orders from $1,000.00 to $1,000.99 +- Bucket 1010: Aggregates orders from $1,010.00 to $1,010.99 +- Bucket 1020: Aggregates orders from $1,020.00 to $1,020.99 +- Bucket 1030: Aggregates orders from $1,030.00 to $1,030.99 +- Orders at intermediate prices (e.g., $1,005) are NOT included in any bucket + +CSV Format: + price,bid_eth_qty,ask_eth_qty + 1000,155.7,12.3 + 1010,150.3,15.8 + ... + +Author: Bookmap Python API +""" + +import bookmap as bm +import csv +import threading +import os + +# Global data structures +instrument_data = {} + +# Configuration constants +BUCKET_SIZE = 10 # USD +MIN_BUCKET = 1000 # USD +MAX_BUCKET = 6000 # USD +SNAPSHOT_DELAY = 3.0 # seconds to wait for snapshot completion after subscription + + +def handle_subscribe_instrument(addon, alias, full_name, is_crypto, pips, size_multiplier, instrument_multiplier, supported_features): + """ + Called when an instrument is subscribed in Bookmap. + Only processes instruments containing 'ETHUSDT' or 'ETH-USDT' in their alias. 
+ + Args: + addon: The addon instance + alias: Instrument alias/symbol + full_name: Full instrument name + is_crypto: Boolean indicating if this is a crypto instrument + pips: Price tick size multiplier + size_multiplier: Size/volume multiplier for converting to actual quantities + instrument_multiplier: Additional instrument multiplier + supported_features: Features supported by this instrument + """ + # Filter: Only process ETH/USDT instruments (supports both ETHUSDT and ETH-USDT formats) + is_eth_usdt = "ETHUSDT" in alias or "ETH-USDT" in alias + if not is_eth_usdt: + print(f"Skipping non-ETH/USDT instrument: {alias}", flush=True) + return + + print(f"Subscribing to ETH Walls Snapshot for: {alias}", flush=True) + + # Initialize instrument data structure with the default CSV path + csv_path_default = "/Users/dannyalmaden/python_projects/python-api-bookmap/data/ethusdt_snapshot.csv" + + instrument_data[alias] = { + "full_name": full_name, + "is_crypto": is_crypto, + "pips": pips, + "size_multiplier": size_multiplier, + "instrument_multiplier": instrument_multiplier, + "order_book": bm.create_order_book(), + "csv_path": csv_path_default, + "snapshot_received": False, + "snapshot_timer": None + } + + # Add settings parameter for CSV output path + bm.add_string_settings_parameter(addon, alias, "CSV Output Path", csv_path_default) + + # Subscribe to depth data + req_id = hash(alias) % 1000000 # Derive a request ID from the alias (collisions possible but unlikely) + bm.subscribe_to_depth(addon, alias, req_id) + + # Start timer to export snapshot after depth subscription + timer = threading.Timer(SNAPSHOT_DELAY, export_snapshot_to_csv, args=[addon, alias]) + timer.daemon = True + timer.start() + instrument_data[alias]["snapshot_timer"] = timer + + print(f"Successfully subscribed to {alias} - Snapshot will export in {SNAPSHOT_DELAY}s", flush=True) + print(f"Snapshot will be exported to: {csv_path_default}", flush=True) + + +def 
handle_unsubscribe_instrument(addon, alias): + """ + Called when an instrument is unsubscribed. + Cleans up resources and cancels timers. + + Args: + addon: The addon instance + alias: Instrument alias being unsubscribed + """ + print(f"Unsubscribing from: {alias}", flush=True) + + # Cancel snapshot timer if exists + if alias in instrument_data and instrument_data[alias]["snapshot_timer"] is not None: + instrument_data[alias]["snapshot_timer"].cancel() + + # Clean up data structures + if alias in instrument_data: + del instrument_data[alias] + + + print(f"Cleanup complete for: {alias}", flush=True) + + +def on_depth(addon, alias, is_bid, price_level, size_level): + """ + Handler for depth updates. Updates the order book with incoming depth data. + + Args: + addon: The addon instance + alias: Instrument alias + is_bid: True for bid/buy side, False for ask/sell side + price_level: Integer price level (must be multiplied by pips) + size_level: Integer size level (must be divided by size_multiplier) + """ + # Skip if instrument not tracked + if alias not in instrument_data: + return + + # Update internal order book + order_book = instrument_data[alias]["order_book"] + bm.on_depth(order_book, is_bid, price_level, size_level) + + + +def on_settings_change(addon, alias, setting_name, field_type, new_value): + """ + Handler for settings parameter changes. Updates CSV path when user changes it. + + Args: + addon: The addon instance + alias: Instrument alias + setting_name: Name of the setting that changed + field_type: Type of the field (string, int, etc.) 
+ new_value: New value of the setting + """ + if setting_name == "CSV Output Path" and alias in instrument_data: + old_path = instrument_data[alias]["csv_path"] + instrument_data[alias]["csv_path"] = new_value + print(f"CSV output path updated for {alias}:", flush=True) + print(f" Old: {old_path}", flush=True) + print(f" New: {new_value}", flush=True) + + +def recalculate_liquidity_buckets(alias, order_book, pips, size_multiplier): + """ + Calculates aggregated liquidity for all price buckets by scanning + the entire order book. + + Each bucket only aggregates orders within a $1 range at exact integer price levels. + For example: + - Bucket 1000: Only includes orders from $1,000.00 to $1,000.99 + - Bucket 1010: Only includes orders from $1,010.00 to $1,010.99 + - Bucket 1020: Only includes orders from $1,020.00 to $1,020.99 + - Orders at intermediate prices (e.g., $1,005) are excluded from all buckets + + Args: + alias: Instrument alias + order_book: Order book dictionary with 'bids' and 'asks' SortedDicts + pips: Price multiplier for converting price levels to actual prices + size_multiplier: Divisor for converting size levels to actual ETH quantities + + Returns: + Dictionary mapping bucket prices to {'bid': float, 'ask': float} + """ + # Initialize all buckets to zero + liquidity_data = {} + for bucket in range(MIN_BUCKET, MAX_BUCKET + 1, BUCKET_SIZE): + liquidity_data[bucket] = {"bid": 0.0, "ask": 0.0} + + # Aggregate bid liquidity + for price_level, size_level in order_book["bids"].items(): + actual_price = price_level * pips + bucket = int(actual_price // BUCKET_SIZE) * BUCKET_SIZE + + # Only include orders where the integer price matches the bucket exactly + # e.g., for bucket 3000, only prices from 3000.00 to 3000.99 + if MIN_BUCKET <= bucket <= MAX_BUCKET and size_level > 0: + # Check if price is within $1 of the bucket (bucket to bucket+0.99) + if bucket <= actual_price < bucket + 1: + eth_qty = size_level / size_multiplier + liquidity_data[bucket]["bid"] 
+= eth_qty + + # Aggregate ask liquidity + for price_level, size_level in order_book["asks"].items(): + actual_price = price_level * pips + bucket = int(actual_price // BUCKET_SIZE) * BUCKET_SIZE + + # Only include orders where the integer price matches the bucket exactly + # e.g., for bucket 3000, only prices from 3000.00 to 3000.99 + if MIN_BUCKET <= bucket <= MAX_BUCKET and size_level > 0: + # Check if price is within $1 of the bucket (bucket to bucket+0.99) + if bucket <= actual_price < bucket + 1: + eth_qty = size_level / size_multiplier + liquidity_data[bucket]["ask"] += eth_qty + + return liquidity_data + + +def export_snapshot_to_csv(addon, alias): + """ + Exports order book snapshot to CSV file. Only exports once per subscription. + + Args: + addon: The addon instance + alias: Instrument alias + """ + if alias not in instrument_data: + print(f"No instrument data available for {alias}", flush=True) + return + + # Check if already exported + if instrument_data[alias]["snapshot_received"]: + print(f"Snapshot already exported for {alias}", flush=True) + return + + # Get instrument data + csv_path = instrument_data[alias]["csv_path"] + order_book = instrument_data[alias]["order_book"] + pips = instrument_data[alias]["pips"] + size_multiplier = instrument_data[alias]["size_multiplier"] + + # Create the data directory if it doesn't exist + csv_dir = os.path.dirname(csv_path) + if csv_dir and not os.path.exists(csv_dir): + try: + os.makedirs(csv_dir, exist_ok=True) + print(f"Created directory: {csv_dir}", flush=True) + except Exception as e: + print(f"ERROR: Failed to create output directory {csv_dir}: {e}", flush=True) + return + elif csv_dir: + print(f"Output directory: {csv_dir}", flush=True) + + # Check if order book has data + bid_count = len(order_book["bids"]) + ask_count = len(order_book["asks"]) + + print(f"Capturing snapshot for {alias}:", flush=True) + print(f" Order book depth: {bid_count} bids, {ask_count} asks", flush=True) + + if bid_count == 0 and 
ask_count == 0: + print(f"WARNING: Order book is empty for {alias}. Snapshot may not be complete.", flush=True) + + try: + # Calculate aggregated liquidity buckets + liquidity_data = recalculate_liquidity_buckets(alias, order_book, pips, size_multiplier) + + # Write to CSV + with open(csv_path, 'w', newline='') as csvfile: + writer = csv.writer(csvfile) + + # Write header + writer.writerow(['price', 'bid_eth_qty', 'ask_eth_qty']) + + # Write data for all buckets in order + for bucket in range(MIN_BUCKET, MAX_BUCKET + 1, BUCKET_SIZE): + bid_qty = liquidity_data.get(bucket, {}).get("bid", 0.0) + ask_qty = liquidity_data.get(bucket, {}).get("ask", 0.0) + + # Format quantities to 4 decimal places + writer.writerow([bucket, f"{bid_qty:.4f}", f"{ask_qty:.4f}"]) + + # Mark snapshot as received + instrument_data[alias]["snapshot_received"] = True + + print(f"✓ Snapshot successfully exported to: {csv_path}", flush=True) + print(f" Total buckets: {len(liquidity_data)}", flush=True) + + # Calculate total liquidity for summary + total_bid_eth = sum(data["bid"] for data in liquidity_data.values()) + total_ask_eth = sum(data["ask"] for data in liquidity_data.values()) + print(f" Total bid liquidity: {total_bid_eth:.4f} ETH", flush=True) + print(f" Total ask liquidity: {total_ask_eth:.4f} ETH", flush=True) + + except PermissionError as e: + print(f"ERROR: Permission denied writing to {csv_path}", flush=True) + print(f" Please check file permissions or change the CSV path in settings.", flush=True) + except OSError as e: + print(f"ERROR: Invalid file path {csv_path}", flush=True) + print(f" {str(e)}", flush=True) + print(f" Please update the CSV path in addon settings.", flush=True) + except Exception as e: + print(f"ERROR: Failed to export snapshot for {alias}: {str(e)}", flush=True) + + +if __name__ == "__main__": + print("=" * 70, flush=True) + print("ETH Walls Snapshot - Immediate Order Book Liquidity Exporter", flush=True) + print("=" * 70, flush=True) + 
print(f"Configuration:", flush=True) + print(f" Price range: {MIN_BUCKET:,} - {MAX_BUCKET:,} USD", flush=True) + print(f" Bucket interval: ${BUCKET_SIZE} (labels at $10 increments)", flush=True) + print(f" Aggregation range: $1 per bucket (exact integer prices)", flush=True) + print(f" Total buckets: {(MAX_BUCKET - MIN_BUCKET) // BUCKET_SIZE + 1}", flush=True) + print(f" Snapshot delay: {SNAPSHOT_DELAY} seconds", flush=True) + print("", flush=True) + print("Behavior:", flush=True) + print(" • Only tracks instruments containing 'ETHUSDT' or 'ETH-USDT'", flush=True) + print(" • Captures ONE snapshot per subscription", flush=True) + print(" • Exports to CSV immediately after snapshot received", flush=True) + print(" • CSV path configurable via addon settings", flush=True) + print("=" * 70, flush=True) + + # Create addon instance + addon = bm.create_addon() + + # Register handlers + bm.add_depth_handler(addon, on_depth) + bm.add_on_setting_change_handler(addon, on_settings_change) + + # Start addon with subscribe/unsubscribe handlers + bm.start_addon(addon, handle_subscribe_instrument, handle_unsubscribe_instrument) + + print("ETH Walls Snapshot addon started successfully!", flush=True) + print("Waiting for ETHUSDT instruments to be subscribed...", flush=True) + print("", flush=True) + + # Wait until addon is turned off + bm.wait_until_addon_is_turned_off(addon) + + print("", flush=True) + print("ETH Walls Snapshot addon stopped.", flush=True) \ No newline at end of file diff --git a/examples/sol_walls_snapshot.py b/examples/sol_walls_snapshot.py new file mode 100644 index 0000000..49ee774 --- /dev/null +++ b/examples/sol_walls_snapshot.py @@ -0,0 +1,345 @@ +""" +SOL Walls Snapshot - Immediate Order Book Liquidity Snapshot Exporter + +This Bookmap addon captures a single snapshot of SOL/USDT order book liquidity (supports SOLUSDT and SOL-USDT formats) +immediately after subscription and exports to a user-configured CSV file. 
+ +Features: +- One-time snapshot capture (no periodic exports) +- Monitors bid/ask liquidity across 271 price buckets (30 to 300 USD) +- Each bucket represents a $1 range at exact integer price levels +- Configurable CSV output path via settings UI +- Exports immediately after order book snapshot is received +- Default export directory: /Users/dannyalmaden/python_projects/python-api-bookmap/data +- Default file format: solusdt_snapshot.csv + +Bucket Behavior: +- Bucket 140: Aggregates orders from $140.00 to $140.99 +- Bucket 141: Aggregates orders from $141.00 to $141.99 +- Bucket 200: Aggregates orders from $200.00 to $200.99 +- Orders at intermediate prices (e.g., $140.50) are NOT included in any bucket + +CSV Format: + price,bid_sol_qty,ask_sol_qty + 50,1234.5678,987.6543 + 51,1150.3421,1012.8976 + ... + +Author: Bookmap Python API +""" + +import bookmap as bm +import csv +import threading +import time +import os +from pathlib import Path +from collections import defaultdict +from datetime import datetime + +# Global data structures +instrument_data = {} + +# Configuration constants +BUCKET_SIZE = 1 # USD +MIN_BUCKET = 30 # USD +MAX_BUCKET = 300 # USD +SNAPSHOT_DELAY = 3.0 # seconds to wait for snapshot completion after subscription + + +def handle_subscribe_instrument(addon, alias, full_name, is_crypto, pips, size_multiplier, instrument_multiplier, supported_features): + """ + Called when an instrument is subscribed in Bookmap. + Only processes instruments containing 'SOLUSDT' or 'SOL-USDT' in their alias. 
+ + Args: + addon: The addon instance + alias: Instrument alias/symbol + full_name: Full instrument name + is_crypto: Boolean indicating if this is a crypto instrument + pips: Price tick size multiplier + size_multiplier: Size/volume multiplier for converting to actual quantities + instrument_multiplier: Additional instrument multiplier + supported_features: Features supported by this instrument + """ + # Filter: Only process SOL/USDT instruments (supports both SOLUSDT and SOL-USDT formats) + is_sol_usdt = "SOLUSDT" in alias or "SOL-USDT" in alias + if not is_sol_usdt: + print(f"Skipping non-SOL/USDT instrument: {alias}", flush=True) + return + + print(f"Subscribing to SOL Walls Snapshot for: {alias}", flush=True) + + # Initialize instrument data structure with the default CSV path + csv_path_default = "/Users/dannyalmaden/python_projects/python-api-bookmap/data/solusdt_snapshot.csv" + + instrument_data[alias] = { + "full_name": full_name, + "is_crypto": is_crypto, + "pips": pips, + "size_multiplier": size_multiplier, + "instrument_multiplier": instrument_multiplier, + "order_book": bm.create_order_book(), + "csv_path": csv_path_default, + "snapshot_received": False, + "snapshot_timer": None + } + + # Add settings parameter for CSV output path + bm.add_string_settings_parameter(addon, alias, "CSV Output Path", csv_path_default) + + # Subscribe to depth data + req_id = hash(alias) % 1000000 # Derive a request ID from the alias (collisions possible but unlikely) + bm.subscribe_to_depth(addon, alias, req_id) + + # Start timer to export snapshot after depth subscription + timer = threading.Timer(SNAPSHOT_DELAY, export_snapshot_to_csv, args=[addon, alias]) + timer.daemon = True + timer.start() + instrument_data[alias]["snapshot_timer"] = timer + + print(f"Successfully subscribed to {alias} - Snapshot will export in {SNAPSHOT_DELAY}s", flush=True) + print(f"Snapshot will be exported to: {csv_path_default}", flush=True) + + +def handle_unsubscribe_instrument(addon, alias): + """ + Called when an instrument is 
unsubscribed. + Cleans up resources and cancels timers. + + Args: + addon: The addon instance + alias: Instrument alias being unsubscribed + """ + print(f"Unsubscribing from: {alias}", flush=True) + + # Cancel snapshot timer if exists + if alias in instrument_data and instrument_data[alias]["snapshot_timer"] is not None: + instrument_data[alias]["snapshot_timer"].cancel() + + # Clean up data structures + if alias in instrument_data: + del instrument_data[alias] + + + print(f"Cleanup complete for: {alias}", flush=True) + + +def on_depth(addon, alias, is_bid, price_level, size_level): + """ + Handler for depth updates. Updates the order book with incoming depth data. + + Args: + addon: The addon instance + alias: Instrument alias + is_bid: True for bid/buy side, False for ask/sell side + price_level: Integer price level (must be multiplied by pips) + size_level: Integer size level (must be divided by size_multiplier) + """ + # Skip if instrument not tracked + if alias not in instrument_data: + return + + # Update internal order book + order_book = instrument_data[alias]["order_book"] + bm.on_depth(order_book, is_bid, price_level, size_level) + + + +def on_settings_change(addon, alias, setting_name, field_type, new_value): + """ + Handler for settings parameter changes. Updates CSV path when user changes it. + + Args: + addon: The addon instance + alias: Instrument alias + setting_name: Name of the setting that changed + field_type: Type of the field (string, int, etc.) 
+ new_value: New value of the setting + """ + if setting_name == "CSV Output Path" and alias in instrument_data: + old_path = instrument_data[alias]["csv_path"] + instrument_data[alias]["csv_path"] = new_value + print(f"CSV output path updated for {alias}:", flush=True) + print(f" Old: {old_path}", flush=True) + print(f" New: {new_value}", flush=True) + + +def recalculate_liquidity_buckets(alias, order_book, pips, size_multiplier): + """ + Calculates aggregated liquidity for all price buckets by scanning + the entire order book. + + Each bucket only aggregates orders within a $1 range at exact integer price levels. + For example: + - Bucket 140: Only includes orders from $140.00 to $140.99 + - Bucket 141: Only includes orders from $141.00 to $141.99 + - Orders at intermediate prices (e.g., $140.50) are excluded from all buckets + + Args: + alias: Instrument alias + order_book: Order book dictionary with 'bids' and 'asks' SortedDicts + pips: Price multiplier for converting price levels to actual prices + size_multiplier: Divisor for converting size levels to actual SOL quantities + + Returns: + Dictionary mapping bucket prices to {'bid': float, 'ask': float} + """ + # Initialize all buckets to zero + liquidity_data = {} + for bucket in range(MIN_BUCKET, MAX_BUCKET + 1, BUCKET_SIZE): + liquidity_data[bucket] = {"bid": 0.0, "ask": 0.0} + + # Aggregate bid liquidity + for price_level, size_level in order_book["bids"].items(): + actual_price = price_level * pips + bucket = int(actual_price // BUCKET_SIZE) * BUCKET_SIZE + + # Only include orders where the integer price matches the bucket exactly + # e.g., for bucket 140, only prices from 140.00 to 140.99 + if MIN_BUCKET <= bucket <= MAX_BUCKET and size_level > 0: + # Check if price is within $1 of the bucket (bucket to bucket+0.99) + if bucket <= actual_price < bucket + 1: + sol_qty = size_level / size_multiplier + liquidity_data[bucket]["bid"] += sol_qty + + # Aggregate ask liquidity + for price_level, size_level in 
order_book["asks"].items(): + actual_price = price_level * pips + bucket = int(actual_price // BUCKET_SIZE) * BUCKET_SIZE + + # Only include orders where the integer price matches the bucket exactly + # e.g., for bucket 140, only prices from 140.00 to 140.99 + if MIN_BUCKET <= bucket <= MAX_BUCKET and size_level > 0: + # Check if price is within $1 of the bucket (bucket to bucket+0.99) + if bucket <= actual_price < bucket + 1: + sol_qty = size_level / size_multiplier + liquidity_data[bucket]["ask"] += sol_qty + + return liquidity_data + + +def export_snapshot_to_csv(addon, alias): + """ + Exports order book snapshot to CSV file. Only exports once per subscription. + + Args: + addon: The addon instance + alias: Instrument alias + """ + if alias not in instrument_data: + print(f"No instrument data available for {alias}", flush=True) + return + + # Check if already exported + if instrument_data[alias]["snapshot_received"]: + print(f"Snapshot already exported for {alias}", flush=True) + return + + # Get instrument data + csv_path = instrument_data[alias]["csv_path"] + order_book = instrument_data[alias]["order_book"] + pips = instrument_data[alias]["pips"] + size_multiplier = instrument_data[alias]["size_multiplier"] + + # Create the data directory if it doesn't exist + csv_dir = os.path.dirname(csv_path) + if csv_dir and not os.path.exists(csv_dir): + try: + os.makedirs(csv_dir, exist_ok=True) + print(f"Created directory: {csv_dir}", flush=True) + except Exception as e: + print(f"ERROR: Failed to create output directory {csv_dir}: {e}", flush=True) + return + elif csv_dir: + print(f"Output directory: {csv_dir}", flush=True) + + # Check if order book has data + bid_count = len(order_book["bids"]) + ask_count = len(order_book["asks"]) + + print(f"Capturing snapshot for {alias}:", flush=True) + print(f" Order book depth: {bid_count} bids, {ask_count} asks", flush=True) + + if bid_count == 0 and ask_count == 0: + print(f"WARNING: Order book is empty for {alias}. 
Snapshot may not be complete.", flush=True) + + try: + # Calculate aggregated liquidity buckets + liquidity_data = recalculate_liquidity_buckets(alias, order_book, pips, size_multiplier) + + # Write to CSV + with open(csv_path, 'w', newline='') as csvfile: + writer = csv.writer(csvfile) + + # Write header + writer.writerow(['price', 'bid_sol_qty', 'ask_sol_qty']) + + # Write data for all buckets in order + for bucket in range(MIN_BUCKET, MAX_BUCKET + 1, BUCKET_SIZE): + bid_qty = liquidity_data.get(bucket, {}).get("bid", 0.0) + ask_qty = liquidity_data.get(bucket, {}).get("ask", 0.0) + + # Format quantities to 4 decimal places + writer.writerow([bucket, f"{bid_qty:.4f}", f"{ask_qty:.4f}"]) + + # Mark snapshot as received + instrument_data[alias]["snapshot_received"] = True + + print(f"✓ Snapshot successfully exported to: {csv_path}", flush=True) + print(f" Total buckets: {len(liquidity_data)}", flush=True) + + # Calculate total liquidity for summary + total_bid_sol = sum(data["bid"] for data in liquidity_data.values()) + total_ask_sol = sum(data["ask"] for data in liquidity_data.values()) + print(f" Total bid liquidity: {total_bid_sol:.4f} SOL", flush=True) + print(f" Total ask liquidity: {total_ask_sol:.4f} SOL", flush=True) + + except PermissionError as e: + print(f"ERROR: Permission denied writing to {csv_path}", flush=True) + print(f" Please check file permissions or change the CSV path in settings.", flush=True) + except OSError as e: + print(f"ERROR: Invalid file path {csv_path}", flush=True) + print(f" {str(e)}", flush=True) + print(f" Please update the CSV path in addon settings.", flush=True) + except Exception as e: + print(f"ERROR: Failed to export snapshot for {alias}: {str(e)}", flush=True) + + +if __name__ == "__main__": + print("=" * 70, flush=True) + print("SOL Walls Snapshot - Immediate Order Book Liquidity Exporter", flush=True) + print("=" * 70, flush=True) + print(f"Configuration:", flush=True) + print(f" Price range: {MIN_BUCKET:,} - 
{MAX_BUCKET:,} USD", flush=True) + print(f" Bucket interval: {BUCKET_SIZE:,} USD", flush=True) + print(f" Aggregation range: $1 per bucket (exact integer prices)", flush=True) + print(f" Total buckets: {(MAX_BUCKET - MIN_BUCKET) // BUCKET_SIZE + 1}", flush=True) + print(f" Snapshot delay: {SNAPSHOT_DELAY} seconds", flush=True) + print("", flush=True) + print("Behavior:", flush=True) + print(" • Only tracks instruments containing 'SOLUSDT' or 'SOL-USDT'", flush=True) + print(" • Captures ONE snapshot per subscription", flush=True) + print(" • Exports to CSV immediately after snapshot received", flush=True) + print(" • CSV path configurable via addon settings", flush=True) + print("=" * 70, flush=True) + + # Create addon instance + addon = bm.create_addon() + + # Register handlers + bm.add_depth_handler(addon, on_depth) + bm.add_on_setting_change_handler(addon, on_settings_change) + + # Start addon with subscribe/unsubscribe handlers + bm.start_addon(addon, handle_subscribe_instrument, handle_unsubscribe_instrument) + + print("SOL Walls Snapshot addon started successfully!", flush=True) + print("Waiting for SOLUSDT instruments to be subscribed...", flush=True) + print("", flush=True) + + # Wait until addon is turned off + bm.wait_until_addon_is_turned_off(addon) + + print("", flush=True) + print("SOL Walls Snapshot addon stopped.", flush=True) \ No newline at end of file diff --git a/examples/xrp_walls_snapshot.py b/examples/xrp_walls_snapshot.py new file mode 100644 index 0000000..573804b --- /dev/null +++ b/examples/xrp_walls_snapshot.py @@ -0,0 +1,355 @@ +""" +XRP Walls Snapshot - Immediate Order Book Liquidity Snapshot Exporter + +This Bookmap addon captures a single snapshot of XRP/USDT order book liquidity (supports both XRPUSDT and XRP-USDT formats) +immediately after subscription and exports to a user-configured CSV file. 
+ +Features: +- One-time snapshot capture (no periodic exports) +- Monitors bid/ask liquidity across 551 price buckets (0.5 to 6.0 USD) +- Each bucket represents a $0.01 range at exact price levels +- Configurable CSV output path via settings UI +- Exports immediately after order book snapshot is received +- Default export directory: /Users/dannyalmaden/python_projects/python-api-bookmap/data +- Default file format: xrpusdt_snapshot.csv + +Bucket Behavior: +- Bucket 0.50: Aggregates orders from $0.50 to $0.5099 +- Bucket 2.00: Aggregates orders from $2.00 to $2.0099 +- Bucket 2.01: Aggregates orders from $2.01 to $2.0199 +- Bucket 2.02: Aggregates orders from $2.02 to $2.0299 + +CSV Format: + price,bid_xrp_qty,ask_xrp_qty + 0.50,155000.7,12000.3 + 0.51,150000.3,15000.8 + ... + +Author: Bookmap Python API +""" + +import bookmap as bm +import csv +import threading +import time +import os +from pathlib import Path +from collections import defaultdict +from datetime import datetime + +# Global data structures +instrument_data = {} + +# Configuration constants +BUCKET_SIZE = 0.01 # USD (1 cent) +MIN_BUCKET = 0.5 # USD +MAX_BUCKET = 6.0 # USD +SNAPSHOT_DELAY = 3.0 # seconds to wait for snapshot completion after subscription + + +def handle_subscribe_instrument(addon, alias, full_name, is_crypto, pips, size_multiplier, instrument_multiplier, supported_features): + """ + Called when an instrument is subscribed in Bookmap. + Only processes instruments containing 'XRPUSDT' or 'XRP-USDT' in their alias. 
+ + Args: + addon: The addon instance + alias: Instrument alias/symbol + full_name: Full instrument name + is_crypto: Boolean indicating if this is a crypto instrument + pips: Price tick size multiplier + size_multiplier: Size/volume multiplier for converting to actual quantities + instrument_multiplier: Additional instrument multiplier + supported_features: Features supported by this instrument + """ + # Filter: Only process XRP/USDT instruments (supports both XRPUSDT and XRP-USDT formats) + is_xrp_usdt = "XRPUSDT" in alias or "XRP-USDT" in alias + if not is_xrp_usdt: + print(f"Skipping non-XRP/USDT instrument: {alias}", flush=True) + return + + print(f"Subscribing to XRP Walls Snapshot for: {alias}", flush=True) + + # Initialize instrument data structure with the default CSV path + csv_path_default = "/Users/dannyalmaden/python_projects/python-api-bookmap/data/xrpusdt_snapshot.csv" + + instrument_data[alias] = { + "full_name": full_name, + "is_crypto": is_crypto, + "pips": pips, + "size_multiplier": size_multiplier, + "instrument_multiplier": instrument_multiplier, + "order_book": bm.create_order_book(), + "csv_path": csv_path_default, + "snapshot_received": False, + "snapshot_timer": None + } + + # Add settings parameter for CSV output path + bm.add_string_settings_parameter(addon, alias, "CSV Output Path", csv_path_default) + + # Subscribe to depth data + req_id = hash(alias) % 1000000 # Derive a request ID from the alias (collisions possible but unlikely) + bm.subscribe_to_depth(addon, alias, req_id) + + # Start timer to export snapshot after depth subscription + timer = threading.Timer(SNAPSHOT_DELAY, export_snapshot_to_csv, args=[addon, alias]) + timer.daemon = True + timer.start() + instrument_data[alias]["snapshot_timer"] = timer + + print(f"Successfully subscribed to {alias} - Snapshot will export in {SNAPSHOT_DELAY}s", flush=True) + print(f"Snapshot will be exported to: {csv_path_default}", flush=True) + + +def 
handle_unsubscribe_instrument(addon, alias): + """ + Called when an instrument is unsubscribed. + Cleans up resources and cancels timers. + + Args: + addon: The addon instance + alias: Instrument alias being unsubscribed + """ + print(f"Unsubscribing from: {alias}", flush=True) + + # Cancel snapshot timer if exists + if alias in instrument_data and instrument_data[alias]["snapshot_timer"] is not None: + instrument_data[alias]["snapshot_timer"].cancel() + + # Clean up data structures + if alias in instrument_data: + del instrument_data[alias] + + + print(f"Cleanup complete for: {alias}", flush=True) + + +def on_depth(addon, alias, is_bid, price_level, size_level): + """ + Handler for depth updates. Updates the order book with incoming depth data. + + Args: + addon: The addon instance + alias: Instrument alias + is_bid: True for bid/buy side, False for ask/sell side + price_level: Integer price level (must be multiplied by pips) + size_level: Integer size level (must be divided by size_multiplier) + """ + # Skip if instrument not tracked + if alias not in instrument_data: + return + + # Update internal order book + order_book = instrument_data[alias]["order_book"] + bm.on_depth(order_book, is_bid, price_level, size_level) + + + +def on_settings_change(addon, alias, setting_name, field_type, new_value): + """ + Handler for settings parameter changes. Updates CSV path when user changes it. + + Args: + addon: The addon instance + alias: Instrument alias + setting_name: Name of the setting that changed + field_type: Type of the field (string, int, etc.) 
+ new_value: New value of the setting + """ + if setting_name == "CSV Output Path" and alias in instrument_data: + old_path = instrument_data[alias]["csv_path"] + instrument_data[alias]["csv_path"] = new_value + print(f"CSV output path updated for {alias}:", flush=True) + print(f" Old: {old_path}", flush=True) + print(f" New: {new_value}", flush=True) + + +def recalculate_liquidity_buckets(alias, order_book, pips, size_multiplier): + """ + Calculates aggregated liquidity for all price buckets by scanning + the entire order book. + + Each bucket only aggregates orders within a $0.01 range at exact price levels. + For example: + - Bucket 0.50: Only includes orders from $0.50 to $0.5099 + - Bucket 2.00: Only includes orders from $2.00 to $2.0099 + - Bucket 2.01: Only includes orders from $2.01 to $2.0199 + - Bucket 2.02: Only includes orders from $2.02 to $2.0299 + + Args: + alias: Instrument alias + order_book: Order book dictionary with 'bids' and 'asks' SortedDicts + pips: Price multiplier for converting price levels to actual prices + size_multiplier: Divisor for converting size levels to actual XRP quantities + + Returns: + Dictionary mapping bucket prices to {'bid': float, 'ask': float} + """ + # Initialize all buckets to zero + liquidity_data = {} + + # Generate buckets using an integer index to avoid floating-point issues. + # Round the bucket count before truncating: plain int((6.0 - 0.5) / 0.01) + # evaluates to 549 due to float error and would silently drop the 6.00 bucket. + num_buckets = int(round((MAX_BUCKET - MIN_BUCKET) / BUCKET_SIZE)) + 1 + for i in range(num_buckets): + bucket = round(MIN_BUCKET + i * BUCKET_SIZE, 2) + liquidity_data[bucket] = {"bid": 0.0, "ask": 0.0} + + # Aggregate bid liquidity + for price_level, size_level in order_book["bids"].items(): + actual_price = price_level * pips + # Floor to the cent (the epsilon absorbs float error) so a price such as + # $2.007 stays in bucket 2.00; round-to-nearest would map it to 2.01, + # where the range check below would silently discard it + bucket = round(int(actual_price * 100 + 1e-9) / 100, 2) + + # Only include orders where the price matches the bucket exactly + # e.g., for bucket 2.00, only prices from 2.00 to 2.0099 + if MIN_BUCKET <= bucket <= MAX_BUCKET and size_level > 0: + # Check if price is 
within $0.01 of the bucket (bucket to bucket+0.0099) + if bucket <= actual_price < bucket + 0.01: + xrp_qty = size_level / size_multiplier + liquidity_data[bucket]["bid"] += xrp_qty + + # Aggregate ask liquidity + for price_level, size_level in order_book["asks"].items(): + actual_price = price_level * pips + bucket = round(actual_price / BUCKET_SIZE) * BUCKET_SIZE + bucket = round(bucket, 2) # Round to 2 decimal places + + # Only include orders where the price matches the bucket exactly + # e.g., for bucket 2.00, only prices from 2.00 to 2.0099 + if MIN_BUCKET <= bucket <= MAX_BUCKET and size_level > 0: + # Check if price is within $0.01 of the bucket (bucket to bucket+0.0099) + if bucket <= actual_price < bucket + 0.01: + xrp_qty = size_level / size_multiplier + liquidity_data[bucket]["ask"] += xrp_qty + + return liquidity_data + + +def export_snapshot_to_csv(addon, alias): + """ + Exports order book snapshot to CSV file. Only exports once per subscription. + + Args: + addon: The addon instance + alias: Instrument alias + """ + if alias not in instrument_data: + print(f"No instrument data available for {alias}", flush=True) + return + + # Check if already exported + if instrument_data[alias]["snapshot_received"]: + print(f"Snapshot already exported for {alias}", flush=True) + return + + # Get instrument data + csv_path = instrument_data[alias]["csv_path"] + order_book = instrument_data[alias]["order_book"] + pips = instrument_data[alias]["pips"] + size_multiplier = instrument_data[alias]["size_multiplier"] + + # Create the data directory if it doesn't exist + csv_dir = os.path.dirname(csv_path) + if csv_dir and not os.path.exists(csv_dir): + try: + os.makedirs(csv_dir, exist_ok=True) + print(f"Created directory: {csv_dir}", flush=True) + except Exception as e: + print(f"ERROR: Failed to create output directory {csv_dir}: {e}", flush=True) + return + elif csv_dir: + print(f"Output directory: {csv_dir}", flush=True) + + # Check if order book has data + bid_count = 
len(order_book["bids"]) + ask_count = len(order_book["asks"]) + + print(f"Capturing snapshot for {alias}:", flush=True) + print(f" Order book depth: {bid_count} bids, {ask_count} asks", flush=True) + + if bid_count == 0 and ask_count == 0: + print(f"WARNING: Order book is empty for {alias}. Snapshot may not be complete.", flush=True) + + try: + # Calculate aggregated liquidity buckets + liquidity_data = recalculate_liquidity_buckets(alias, order_book, pips, size_multiplier) + + # Write to CSV + with open(csv_path, 'w', newline='') as csvfile: + writer = csv.writer(csvfile) + + # Write header + writer.writerow(['price', 'bid_xrp_qty', 'ask_xrp_qty']) + + # Write data for all buckets in order + num_buckets = int((MAX_BUCKET - MIN_BUCKET) / BUCKET_SIZE) + 1 + for i in range(num_buckets): + bucket = round(MIN_BUCKET + i * BUCKET_SIZE, 2) + bid_qty = liquidity_data.get(bucket, {}).get("bid", 0.0) + ask_qty = liquidity_data.get(bucket, {}).get("ask", 0.0) + + # Format bucket price to 2 decimal places and quantities to 4 decimal places + writer.writerow([f"{bucket:.2f}", f"{bid_qty:.4f}", f"{ask_qty:.4f}"]) + + # Mark snapshot as received + instrument_data[alias]["snapshot_received"] = True + + print(f"✓ Snapshot successfully exported to: {csv_path}", flush=True) + print(f" Total buckets: {len(liquidity_data)}", flush=True) + + # Calculate total liquidity for summary + total_bid_xrp = sum(data["bid"] for data in liquidity_data.values()) + total_ask_xrp = sum(data["ask"] for data in liquidity_data.values()) + print(f" Total bid liquidity: {total_bid_xrp:.4f} XRP", flush=True) + print(f" Total ask liquidity: {total_ask_xrp:.4f} XRP", flush=True) + + except PermissionError as e: + print(f"ERROR: Permission denied writing to {csv_path}", flush=True) + print(f" Please check file permissions or change the CSV path in settings.", flush=True) + except OSError as e: + print(f"ERROR: Invalid file path {csv_path}", flush=True) + print(f" {str(e)}", flush=True) + print(f" Please 
update the CSV path in addon settings.", flush=True) + except Exception as e: + print(f"ERROR: Failed to export snapshot for {alias}: {str(e)}", flush=True) + + +if __name__ == "__main__": + print("=" * 70, flush=True) + print("XRP Walls Snapshot - Immediate Order Book Liquidity Exporter", flush=True) + print("=" * 70, flush=True) + print(f"Configuration:", flush=True) + print(f" Price range: ${MIN_BUCKET:.2f} - ${MAX_BUCKET:.2f} USD", flush=True) + print(f" Bucket interval: ${BUCKET_SIZE:.2f}", flush=True) + print(f" Aggregation range: $0.01 per bucket (exact price levels)", flush=True) + print(f" Total buckets: {int((MAX_BUCKET - MIN_BUCKET) / BUCKET_SIZE) + 1} (551 buckets)", flush=True) + print(f" Snapshot delay: {SNAPSHOT_DELAY} seconds", flush=True) + print("", flush=True) + print("Behavior:", flush=True) + print(" • Only tracks instruments containing 'XRPUSDT' or 'XRP-USDT'", flush=True) + print(" • Captures ONE snapshot per subscription", flush=True) + print(" • Exports to CSV immediately after snapshot received", flush=True) + print(" • CSV path configurable via addon settings", flush=True) + print("=" * 70, flush=True) + + # Create addon instance + addon = bm.create_addon() + + # Register handlers + bm.add_depth_handler(addon, on_depth) + bm.add_on_setting_change_handler(addon, on_settings_change) + + # Start addon with subscribe/unsubscribe handlers + bm.start_addon(addon, handle_subscribe_instrument, handle_unsubscribe_instrument) + + print("XRP Walls Snapshot addon started successfully!", flush=True) + print("Waiting for XRPUSDT instruments to be subscribed...", flush=True) + print("", flush=True) + + # Wait until addon is turned off + bm.wait_until_addon_is_turned_off(addon) + + print("", flush=True) + print("XRP Walls Snapshot addon stopped.", flush=True) \ No newline at end of file
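
One detail worth sanity-checking in `recalculate_liquidity_buckets` is the price-to-bucket mapping: each $0.01 bucket is documented as covering $x.xx00 through $x.xx99, so a price must be *floored* into its bucket, not rounded to the nearest bucket. Below is a minimal standalone sketch of that mapping with no Bookmap dependency; the 0.50/6.00 bounds are assumptions chosen for illustration (consistent with the 551-bucket figure the addon prints at startup), not values taken from the addon source.

```python
# Standalone sketch of the $0.01 bucket mapping. Constants are assumed
# for illustration only (551 buckets implies a $5.50-wide range).
MIN_BUCKET = 0.50    # assumed
MAX_BUCKET = 6.00    # assumed
BUCKET_SIZE = 0.01

def bucket_for(price):
    """Floor a positive price to its containing $0.01 bucket.

    The small epsilon guards against float error when the price sits
    exactly on a bucket boundary (e.g. 2.01 / 0.01 == 200.999...97).
    """
    return round(int(price / BUCKET_SIZE + 1e-6) * BUCKET_SIZE, 2)

# Toy book: price -> quantity
book = {2.003: 100.0, 2.0099: 50.0, 2.014: 25.0}

buckets = {}
for price, qty in book.items():
    b = bucket_for(price)
    if MIN_BUCKET <= b <= MAX_BUCKET:
        buckets[b] = buckets.get(b, 0.0) + qty

# 2.003 and 2.0099 both land in bucket 2.00; 2.014 lands in 2.01
```

Note that round-to-nearest (`round(price / BUCKET_SIZE)`) would instead push 2.006 into bucket 2.01, contradicting the documented $x.xx00 to $x.xx99 ranges.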