add&impl 006-logseq-cli-import-export.md

This commit is contained in:
rcmerci
2026-01-19 15:39:42 +08:00
parent 83784c6301
commit 91f0cd7c09
12 changed files with 783 additions and 24 deletions

View File

@@ -0,0 +1,83 @@
# Logseq CLI Import/Export Plan
Goal: Add logseq-cli support for import/export with EDN and SQLite formats using the existing db-worker-node server.
Architecture: Extend logseq-cli command parsing and execution to invoke db-worker-node thread APIs for export and import, with minimal new APIs to handle EDN import and SQLite binary payloads over HTTP.
Tech Stack: ClojureScript, babashka/cli, db-worker-node HTTP /v1/invoke, datascript, sqlite-export helpers, Node fs/path.
Related: Builds on docs/agent-guide/004-logseq-cli-verb-subcommands.md and docs/agent-guide/003-db-worker-node-cli-orchestration.md.
## Requirements
- Import types: edn, sqlite.
- Export types: edn, sqlite.
- CLI must work against db-worker-node with repo binding and lock file behavior.
- Output files must be written to user-specified paths with clear success/error messages.
## Proposed CLI UX
Prefer graph-scoped subcommands to keep import/export alongside graph management:
- `logseq graph export --type edn --output <path> [--repo <graph>]`
- `logseq graph export --type sqlite --output <path> [--repo <graph>]`
- `logseq graph import --type edn --input <path> --repo <graph>`
- `logseq graph import --type sqlite --input <path> --repo <graph>`
Notes:
- `graph import` only supports importing into a new graph name; it must not overwrite an existing graph.
- `--repo` is required for import; for export it is required unless the current graph is set in config.
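The flag rules above can be sketched as a small validator. This is an illustrative model of the intended behavior only — the real implementation lives in `src/main/logseq/cli/commands.cljs`; helper and error-code names here are assumptions mirroring the patch below.

```javascript
const IMPORT_EXPORT_TYPES = new Set(["edn", "sqlite"]);

// Mirrors the validation order in the ClojureScript patch: missing type,
// then missing path flags, then unknown type. `--repo` falls back to config
// for export, so only import checks it here.
function validateGraphIo(command, opts) {
  const type = (opts.type || "").trim().toLowerCase();
  if (!type) return { ok: false, code: "missing-type" };
  if (command === "export" && !opts.output) return { ok: false, code: "missing-output" };
  if (command === "import" && !opts.input) return { ok: false, code: "missing-input" };
  if (command === "import" && !opts.repo) return { ok: false, code: "missing-repo" };
  if (!IMPORT_EXPORT_TYPES.has(type)) return { ok: false, code: "invalid-options" };
  return { ok: true, type };
}
```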
## Current Capabilities (Baseline)
- db-worker-node supports `:thread-api/export-edn`, `:thread-api/export-db`, and `:thread-api/import-db`.
- logseq-cli can invoke db-worker-node via `src/main/logseq/cli/transport.cljs` and write files with `transport/write-output`.
- EDN import exists in the app layer (`frontend.handler.db-based.import`) but not in db-worker-node.
- SQLite import in db-worker-node writes the sqlite file, but the CLI needs a full reload step to reflect the new data.
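For orientation, the transport calls above all reduce to a POST against the server's `/v1/invoke` endpoint. The payload shape sketched here is an assumption for illustration; the authoritative encoding is defined in `src/main/logseq/cli/transport.cljs` and may differ (e.g. transit rather than plain JSON).

```javascript
// Illustrative request builder; method names like "thread-api/export-edn"
// come from the thread API surface listed above.
function buildInvokeRequest(baseUrl, method, args) {
  return {
    url: `${baseUrl}/v1/invoke`,
    init: {
      method: "POST",
      headers: { "content-type": "application/json" },
      body: JSON.stringify({ method, args }),
    },
  };
}
```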
## Implementation Plan
1. Review existing CLI command table and action pipeline in `src/main/logseq/cli/commands.cljs` and `src/main/logseq/cli/main.cljs` to locate insertion points for new graph import/export actions.
2. Add new CLI specs for import/export flags (type, input, output, export-type, mode) in `src/main/logseq/cli/commands.cljs`.
3. Extend the command table with `graph import` and `graph export` entries and ensure help output includes them.
4. Add action builders for import/export that validate repo presence, file paths, and allowed types (edn, sqlite).
5. Add CLI helpers for reading input files and writing output files in `src/main/logseq/cli/transport.cljs` (or a new helper namespace), keeping the existing `write-output` behavior for EDN and DB files.
6. Implement export execution:
- EDN: call `thread-api/export-edn` (graph-only) and write EDN file.
   - SQLite: call a new db-worker-node export API that returns a base64 string or transit-safe binary; decode to a Buffer and write the `.sqlite` file.
7. Implement import execution:
- EDN: read EDN file, pass data to a new db-worker-node `thread-api/import-edn` (see below), and return a summary message.
- SQLite: read file as Buffer, pass to a new db-worker-node `thread-api/import-sqlite` (or reuse `import-db` with a wrapper that closes/reopens the repo).
   - Always stop and restart the db-worker-node server from the CLI around an import to ensure a clean reload.
8. Add db-worker-node thread APIs in `src/main/frontend/worker/db_core.cljs`:
- `:thread-api/import-edn` to convert export EDN into tx data via `logseq.db.sqlite.export/build-import` and transact with `:tx-meta` including `::sqlite-export/imported-data? true` so the pipeline rebuilds refs.
- `:thread-api/export-db-base64` (or similar) to return a base64 string for SQLite export over HTTP.
- `:thread-api/import-db-base64` (or similar) to accept base64 input, close existing sqlite connections, import db data, and reopen the repo (or invoke `:thread-api/create-or-open-db` with `:import-type :sqlite-db`).
9. Update db-worker-node server validation (repo binding) if the new thread APIs need special argument shapes.
10. Update CLI output formatting in `src/main/logseq/cli/format.cljs` to print concise success lines like `Exported <type> to <path>` and `Imported <type> from <path>`.
11. Update documentation in `docs/cli/logseq-cli.md` with new commands, examples, and file format notes.
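Steps 6-8 hinge on a plain bytes-to-base64 roundtrip so the SQLite payload survives JSON/HTTP transport. A minimal Node sketch (helper names are illustrative, not the actual API):

```javascript
// Encode exported SQLite bytes for transport over /v1/invoke.
function encodeSqlitePayload(bytes) {
  const buffer = Buffer.isBuffer(bytes) ? bytes : Buffer.from(bytes);
  return buffer.toString("base64");
}

// Decode on the receiving side before writing the .sqlite file or
// handing the bytes to the import API.
function decodeSqlitePayload(base64) {
  return Buffer.from(base64, "base64");
}
```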
## Testing Plan
- Add CLI parsing tests for `graph import` and `graph export` options in `src/test/logseq/cli/commands_test.cljs` (or a new namespace).
- Add integration tests in `src/test/logseq/cli/integration_test.cljs` to:
- export EDN and SQLite from a test graph and assert output files exist and are non-empty.
- import EDN into a new graph and verify a known page/block exists via CLI `show` or `list`.
- import SQLite into a new graph and verify graph metadata or page count.
- Add db-worker-node tests in `src/test/frontend/worker/db_worker_node_test.cljs` for the new import/export thread APIs (EDN build-import path and base64 DB export/import).
- Follow @test-driven-development: write failing tests before implementation.
## Edge Cases
- Large SQLite exports may exceed JSON limits if not base64/transit encoded; ensure streaming-safe or chunked base64 handling.
- Import should fail fast if the repo is missing and `--repo` is not provided, or if input file does not exist.
- SQLite import while the repo is open must close/reopen connections to avoid stale datascript state.
- EDN import should validate the export shape and surface readable errors when EDN is invalid or incompatible.
- Overwrite behavior should be explicit for SQLite imports to prevent accidental data loss.
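The first edge case (payload size) can be mitigated with chunked encoding. A sketch, assuming chunk sizes that are a multiple of 3 bytes so base64 groups never straddle a chunk boundary:

```javascript
// Yields base64 pieces whose concatenation equals the base64 encoding of the
// whole buffer, because every chunk but the last covers a whole number of
// 3-byte input groups (so no padding appears mid-stream).
function* base64Chunks(buffer, chunkBytes = 3 * 1024 * 1024) {
  if (chunkBytes % 3 !== 0) throw new Error("chunkBytes must be a multiple of 3");
  for (let offset = 0; offset < buffer.length; offset += chunkBytes) {
    yield buffer.subarray(offset, offset + chunkBytes).toString("base64");
  }
}
```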
## Decisions
1. `graph import` only imports into a new graph; it must not overwrite an existing graph.
2. No `--mode` flag; both EDN and SQLite imports are replace-style imports.
3. CLI always stops and restarts db-worker-node around imports.
4. `graph export --type edn` is graph-only for now (no page/view/blocks).

View File

@@ -47,8 +47,10 @@ Graph commands:
- `graph remove --repo <name>` - remove a graph
- `graph validate --repo <name>` - validate graph data
- `graph info [--repo <name>]` - show graph metadata (defaults to current graph)
- `graph export --type edn|sqlite --output <path> [--repo <name>]` - export a graph to EDN or SQLite
- `graph import --type edn|sqlite --input <path> --repo <name>` - import a graph from EDN or SQLite (new graph only)
For any command that requires `--repo`, if the target graph does not exist, the CLI returns `graph not exists` (except for `graph create`).
For any command that requires `--repo`, if the target graph does not exist, the CLI returns `graph not exists` (except for `graph create`). `graph import` fails if the target graph already exists.
Server commands:
- `server list` - list running db-worker-node servers
@@ -92,6 +94,7 @@ Options grouping:
Output formats:
- Global `--output <human|json|edn>` (also accepted per subcommand)
- For `graph export`, `--output` refers to the destination file path. Output formatting is controlled via `:output-format` in config or `LOGSEQ_CLI_OUTPUT`.
- Human output is plain text. List/search commands render tables with a final `Count: N` line. For list subcommands, the ID column uses `:db/id` (not UUID). If `:db/ident` exists, an `IDENT` column is included. Times such as list `UPDATED-AT`/`CREATED-AT` and `graph info` `Created at` are shown in human-friendly relative form. Errors include error codes and may include a `Hint:` line. Use `--output json|edn` for structured output.
- `show` human output prints the `:db/id` as the first column followed by a tree:
@@ -110,6 +113,8 @@ Examples:
```bash
node ./static/logseq-cli.js graph create --repo demo
node ./static/logseq-cli.js graph export --type edn --output /tmp/demo.edn --repo demo
node ./static/logseq-cli.js graph import --type edn --input /tmp/demo.edn --repo demo-import
node ./static/logseq-cli.js add block --page TestPage --content "hello world"
node ./static/logseq-cli.js search --text "hello"
node ./static/logseq-cli.js show --page-name TestPage --format json --output json

View File

@@ -579,6 +579,17 @@
(p/let [data (<export-db-file repo)]
(platform/transfer (platform/current) data #js [(.-buffer data)])))
(def-thread-api :thread-api/export-db-base64
[repo]
(when-let [^js db (worker-state/get-sqlite-conn repo :db)]
(.exec db "PRAGMA wal_checkpoint(TRUNCATE)")) ;; flush the WAL into the main db file before export
(p/let [data (<export-db-file repo)]
(when data
(let [buffer (if (instance? js/Buffer data)
data
(js/Buffer.from data))]
(.toString buffer "base64")))))
(def-thread-api :thread-api/export-debug-log-db
[repo]
(when-let [^js db (worker-state/get-sqlite-conn repo :debug-log)]
@@ -603,6 +614,16 @@
(<import-db pool data)
nil)))
(def-thread-api :thread-api/import-db-base64
[repo base64]
(when-not (string/blank? repo)
(p/let [pool (<get-opfs-pool repo)
data (js/Buffer.from base64 "base64")
_ (close-db! repo)
_ (<import-db pool data)
_ (start-db! repo {:import-type :sqlite-db})]
nil)))
(def-thread-api :thread-api/search-blocks
[repo q option]
(search-blocks repo q option))
@@ -695,6 +716,17 @@
:error])
{:export-edn-error (.-message e)}))))
(def-thread-api :thread-api/import-edn
[repo export-edn]
(let [conn (worker-state/get-datascript-conn repo)]
(when-not conn
(throw (ex-info "graph not opened" {:code :graph-not-opened :repo repo})))
(let [{:keys [init-tx block-props-tx misc-tx]} (sqlite-export/build-import export-edn @conn {})
tx-data (vec (concat init-tx block-props-tx misc-tx))
tx-meta {::sqlite-export/imported-data? true}]
(ldb/transact! conn tx-data tx-meta)
{:tx-count (count tx-data)})))
(def-thread-api :thread-api/get-view-data
[repo view-id option]
(let [db @(worker-state/get-datascript-conn repo)]

View File

@@ -110,6 +110,14 @@
:coerce :long}
:format {:desc "Output format (text, json, edn)"}})
(def ^:private graph-export-spec
{:type {:desc "Export type (edn, sqlite)"}
:output {:desc "Output path"}})
(def ^:private graph-import-spec
{:type {:desc "Import type (edn, sqlite)"}
:input {:desc "Input path"}})
(defn- format-commands
[table]
(let [rows (->> table
@@ -228,6 +236,27 @@
:message "page name is required"}
:summary summary})
(defn- missing-type-result
[summary]
{:ok? false
:error {:code :missing-type
:message "type is required"}
:summary summary})
(defn- missing-input-result
[summary]
{:ok? false
:error {:code :missing-input
:message "input is required"}
:summary summary})
(defn- missing-output-result
[summary]
{:ok? false
:error {:code :missing-output
:message "output is required"}
:summary summary})
(defn- missing-search-result
[summary]
{:ok? false
@@ -266,6 +295,13 @@
(def ^:private search-types
#{"page" "block" "tag" "property" "all"})
(def ^:private import-export-types
#{"edn" "sqlite"})
(defn- normalize-import-export-type
[value]
(some-> value string/lower-case string/trim))
(defn- invalid-list-options?
[command opts]
(let [{:keys [order include-journal journal-only]} opts
@@ -337,6 +373,8 @@
(command-entry ["graph" "remove"] :graph-remove "Remove graph" {})
(command-entry ["graph" "validate"] :graph-validate "Validate graph" {})
(command-entry ["graph" "info"] :graph-info "Graph metadata" {})
(command-entry ["graph" "export"] :graph-export "Export graph" graph-export-spec)
(command-entry ["graph" "import"] :graph-import "Import graph" graph-import-spec)
(command-entry ["server" "list"] :server-list "List db-worker-node servers" {})
(command-entry ["server" "status"] :server-status "Show server status for a graph" server-spec)
(command-entry ["server" "start"] :server-start "Start db-worker-node for a graph" server-spec)
@@ -448,6 +486,29 @@
(and (= command :search) (invalid-search-options? opts))
(invalid-options-result summary (invalid-search-options? opts))
(and (= command :graph-export) (not (seq (normalize-import-export-type (:type opts)))))
(missing-type-result summary)
(and (= command :graph-export) (not (seq (:output opts))))
(missing-output-result summary)
(and (= command :graph-export)
(not (contains? import-export-types (normalize-import-export-type (:type opts)))))
(invalid-options-result summary (str "invalid type: " (:type opts)))
(and (= command :graph-import) (not (seq (normalize-import-export-type (:type opts)))))
(missing-type-result summary)
(and (= command :graph-import) (not (seq (:input opts))))
(missing-input-result summary)
(and (= command :graph-import) (not (seq (:repo opts))))
(missing-repo-result summary)
(and (= command :graph-import)
(not (contains? import-export-types (normalize-import-export-type (:type opts)))))
(invalid-options-result summary (str "invalid type: " (:type opts)))
(and (#{:server-status :server-start :server-stop :server-restart} command)
(not (seq (:repo opts))))
(missing-repo-result summary)
@@ -521,6 +582,18 @@
:message "graph not exists"}}))
(p/resolved {:ok? true})))
(defn- ensure-missing-graph
[action config]
(if (and (= :graph-import (:type action)) (:repo action))
(p/let [graphs (cli-server/list-graphs config)
graph (repo->graph (:repo action))]
(if (some #(= graph %) graphs)
{:ok? false
:error {:code :graph-exists
:message "graph already exists"}}
{:ok? true}))
(p/resolved {:ok? true})))
(defn- pick-graph
[options command-args config]
(or (:repo options)
@@ -1044,6 +1117,30 @@
(:graph-list :graph-create :graph-switch :graph-remove :graph-validate :graph-info)
(build-graph-action command graph repo)
:graph-export
(let [export-type (normalize-import-export-type (:type options))]
(if-not (seq repo)
(missing-repo-error "repo is required for export")
{:ok? true
:action {:type :graph-export
:repo repo
:graph (repo->graph repo)
:export-type export-type
:output (:output options)}}))
:graph-import
(let [import-repo (resolve-repo (:repo options))
import-type (normalize-import-export-type (:type options))]
(if-not (seq import-repo)
(missing-repo-error "repo is required for import")
{:ok? true
:action {:type :graph-import
:repo import-repo
:graph (repo->graph import-repo)
:import-type import-type
:input (:input options)
:allow-missing-graph true}}))
(:server-list :server-status :server-start :server-stop :server-restart)
(build-server-action command server-repo)
@@ -1119,6 +1216,51 @@
:logseq.kv/graph-created-at (:kv/value created)
:logseq.kv/schema-version (:kv/value schema)}})))
(defn- execute-graph-export
[action config]
(-> (p/let [cfg (cli-server/ensure-server! config (:repo action))
export-type (:export-type action)
export-result (case export-type
"edn"
(transport/invoke cfg
"thread-api/export-edn"
false
[(:repo action) {:export-type :graph}])
"sqlite"
(transport/invoke cfg
"thread-api/export-db-base64"
true
[(:repo action)])
(throw (ex-info "unsupported export type" {:export-type export-type})))
data (if (= export-type "sqlite")
(js/Buffer.from export-result "base64")
export-result)
format (if (= export-type "sqlite") :sqlite :edn)]
(transport/write-output {:format format :path (:output action) :data data})
{:status :ok
:data {:message (str "wrote " (:output action))}})))
(defn- execute-graph-import
[action config]
(-> (p/let [_ (cli-server/stop-server! config (:repo action))
cfg (cli-server/ensure-server! config (:repo action))
import-type (:import-type action)
input-data (case import-type
"edn" (transport/read-input {:format :edn :path (:input action)})
"sqlite" (transport/read-input {:format :sqlite :path (:input action)})
(throw (ex-info "unsupported import type" {:import-type import-type})))
payload (if (= import-type "sqlite")
(.toString (js/Buffer.from input-data) "base64")
input-data)
method (if (= import-type "sqlite")
"thread-api/import-db-base64"
"thread-api/import-edn")
direct-pass? (= import-type "sqlite")
_ (transport/invoke cfg method direct-pass? [(:repo action) payload])
_ (cli-server/restart-server! config (:repo action))]
{:status :ok
:data {:message (str "imported " import-type " from " (:input action))}})))
(defn- execute-list-page
[action config]
(-> (p/let [cfg (cli-server/ensure-server! config (:repo action))
@@ -1208,7 +1350,7 @@
[?e :block/uuid ?uuid]
[(get-else $ ?e :block/updated-at 0) ?updated]
[(get-else $ ?e :block/created-at 0) ?created]
[(string/includes? ?title ?q)]]
[(clojure.string/includes? ?title ?q)]]
'[:find ?e ?title ?uuid ?updated ?created
:in $ ?q
:where
@@ -1217,10 +1359,12 @@
[?e :block/uuid ?uuid]
[(get-else $ ?e :block/updated-at 0) ?updated]
[(get-else $ ?e :block/created-at 0) ?created]
[(string/includes? (string/lower-case ?title) ?q)]])
[(clojure.string/includes? (clojure.string/lower-case ?title) ?q)]])
q* (if case-sensitive? text (string/lower-case text))]
(transport/invoke cfg "thread-api/q" false [repo [query q*]])))
#_{:clj-kondo/ignore [:aliased-namespace-symbol]}
(defn- query-blocks
[cfg repo text case-sensitive? tag include-content?]
(let [has-tag? (seq tag)
@@ -1237,7 +1381,7 @@
[?e :block/uuid ?uuid]
[(get-else $ ?e :block/updated-at 0) ?updated]
[(get-else $ ?e :block/created-at 0) ?created]
[(string/includes? ?value ?q)]]
[(clojure.string/includes? ?value ?q)]]
case-sensitive?
`[:find ?e ?value ?uuid ?updated ?created
@@ -1248,7 +1392,7 @@
[?e :block/uuid ?uuid]
[(get-else $ ?e :block/updated-at 0) ?updated]
[(get-else $ ?e :block/created-at 0) ?created]
[(string/includes? ?value ?q)]]
[(clojure.string/includes? ?value ?q)]]
has-tag?
`[:find ?e ?value ?uuid ?updated ?created
@@ -1261,7 +1405,7 @@
[?e :block/uuid ?uuid]
[(get-else $ ?e :block/updated-at 0) ?updated]
[(get-else $ ?e :block/created-at 0) ?created]
[(string/includes? (string/lower-case ?value) ?q)]]
[(clojure.string/includes? (clojure.string/lower-case ?value) ?q)]]
:else
`[:find ?e ?value ?uuid ?updated ?created
@@ -1272,7 +1416,7 @@
[?e :block/uuid ?uuid]
[(get-else $ ?e :block/updated-at 0) ?updated]
[(get-else $ ?e :block/created-at 0) ?created]
[(string/includes? (string/lower-case ?value) ?q)]])
[(clojure.string/includes? (clojure.string/lower-case ?value) ?q)]])
q* (if case-sensitive? text (string/lower-case text))
tag-name (some-> tag string/lower-case)]
(if has-tag?
@@ -1424,15 +1568,25 @@
(defn execute
[action config]
(-> (p/let [check (ensure-existing-graph action config)
result (if-not (:ok? check)
(-> (p/let [missing-check (ensure-missing-graph action config)
check (ensure-existing-graph action config)
result (cond
(not (:ok? missing-check))
{:status :error
:error (:error missing-check)}
(not (:ok? check))
{:status :error
:error (:error check)}
:else
(case (:type action)
:graph-list (execute-graph-list action config)
:invoke (execute-invoke action config)
:graph-switch (execute-graph-switch action config)
:graph-info (execute-graph-info action config)
:graph-export (execute-graph-export action config)
:graph-import (execute-graph-import action config)
:list-page (execute-list-page action config)
:list-tag (execute-list-tag action config)
:list-property (execute-list-property action config)

View File

@@ -218,6 +218,14 @@
[{:keys [repo block]}]
(str "Removed block: " block " (repo: " repo ")"))
(defn- format-graph-export
[{:keys [export-type output]}]
(str "Exported " export-type " to " output))
(defn- format-graph-import
[{:keys [import-type input]}]
(str "Imported " import-type " from " input))
(defn- format-graph-action
[command {:keys [graph]}]
(let [verb (case command
@@ -248,6 +256,8 @@
:add-page (format-add-page context)
:remove-page (format-remove-page context)
:remove-block (format-remove-block context)
:graph-export (format-graph-export context)
:graph-import (format-graph-import context)
:search (format-search-results (:results data))
:show (or (:message data) (pr-str data))
(if (and (map? data) (contains? data :message))

View File

@@ -12,7 +12,7 @@
(string/join "\n"
["logseq <command> [options]"
""
"Commands: list page, list tag, list property, add block, add page, remove block, remove page, search, show, graph list, graph create, graph switch, graph remove, graph validate, graph info, server list, server status, server start, server stop, server restart"
"Commands: list page, list tag, list property, add block, add page, remove block, remove page, search, show, graph list, graph create, graph switch, graph remove, graph validate, graph info, graph export, graph import, server list, server status, server start, server stop, server restart"
""
"Options:"
summary]))
@@ -52,9 +52,16 @@
{:exit-code 0
:output (format/format-result result opts)})))
(p/catch (fn [error]
(let [message (or (some-> (ex-data error) :message)
(.-message error)
(str error))]
(let [data (ex-data error)
message (cond
(and (= :http-error (:code data)) (seq (:body data)))
(str "http request failed (" (:status data) "): " (:body data))
(some? (:message data))
(:message data)
:else
(or (.-message error) (str error)))]
{:exit-code 1
:output (format/format-result {:status :error
:error {:code :exception

View File

@@ -1,6 +1,7 @@
(ns logseq.cli.transport
"HTTP transport for communicating with db-worker-node."
(:require [clojure.string :as string]
(:require [cljs.reader :as reader]
[clojure.string :as string]
[logseq.db :as ldb]
[promesa.core :as p]
["fs" :as fs]
@@ -120,4 +121,24 @@
(js/Buffer.from data))]
(fs/writeFileSync path buffer))
:sqlite
(let [buffer (if (instance? js/Buffer data)
data
(js/Buffer.from data))]
(fs/writeFileSync path buffer))
(throw (ex-info "unsupported output format" {:format format}))))
(defn read-input
[{:keys [format path]}]
(case format
:edn
(reader/read-string (.toString (fs/readFileSync path) "utf8"))
:db
(fs/readFileSync path)
:sqlite
(fs/readFileSync path)
(throw (ex-info "unsupported input format" {:format format}))))

View File

@@ -252,6 +252,127 @@
(done))))
(done))))))))
(deftest db-worker-node-import-edn
(async done
(let [daemon-a (atom nil)
daemon-b (atom nil)
data-dir (node-helper/create-tmp-dir "db-worker-import-edn")
repo-a (str "logseq_db_import_edn_a_" (subs (str (random-uuid)) 0 8))
repo-b (str "logseq_db_import_edn_b_" (subs (str (random-uuid)) 0 8))
now (js/Date.now)
page-uuid (random-uuid)]
(-> (p/let [{:keys [host port stop!]}
(db-worker-node/start-daemon! {:data-dir data-dir
:repo repo-a})
_ (reset! daemon-a {:stop! stop!})
_ (invoke host port "thread-api/create-or-open-db" [repo-a {}])
_ (invoke host port "thread-api/transact"
[repo-a
[{:block/uuid page-uuid
:block/title "Import Page"
:block/name "import-page"
:block/tags #{:logseq.class/Page}
:block/created-at now
:block/updated-at now}]
{}
nil])
export-edn (invoke host port "thread-api/export-edn" [repo-a {:export-type :graph}])]
(is (map? export-edn))
(p/let [_ ((:stop! @daemon-a))
{:keys [host port stop!]}
(db-worker-node/start-daemon! {:data-dir data-dir
:repo repo-b})
_ (reset! daemon-b {:stop! stop!})
_ (invoke host port "thread-api/create-or-open-db" [repo-b {}])
_ (invoke host port "thread-api/import-edn" [repo-b export-edn])
result (invoke host port "thread-api/q"
[repo-b
['[:find ?e
:in $ ?title
:where [?e :block/title ?title]]
"Import Page"]])]
(is (seq result))))
(p/catch (fn [e]
(println "[db-worker-node-test] import-edn error:" e)
(is false (str e))))
(p/finally (fn []
(let [stop-a (:stop! @daemon-a)
stop-b (:stop! @daemon-b)]
(cond
(and stop-a stop-b)
(-> (stop-a)
(p/finally (fn [] (-> (stop-b) (p/finally (fn [] (done)))))))
stop-a
(-> (stop-a) (p/finally (fn [] (done))))
stop-b
(-> (stop-b) (p/finally (fn [] (done))))
:else
(done)))))))))
(deftest db-worker-node-import-db-base64
(async done
(let [daemon-a (atom nil)
daemon-b (atom nil)
data-dir (node-helper/create-tmp-dir "db-worker-import-sqlite")
repo-a (str "logseq_db_import_sqlite_a_" (subs (str (random-uuid)) 0 8))
repo-b (str "logseq_db_import_sqlite_b_" (subs (str (random-uuid)) 0 8))
now (js/Date.now)
page-uuid (random-uuid)]
(-> (p/let [{:keys [host port stop!]}
(db-worker-node/start-daemon! {:data-dir data-dir
:repo repo-a})
_ (reset! daemon-a {:stop! stop!})
_ (invoke host port "thread-api/create-or-open-db" [repo-a {}])
_ (invoke host port "thread-api/transact"
[repo-a
[{:block/uuid page-uuid
:block/title "SQLite Import Page"
:block/name "sqlite-import-page"
:block/tags #{:logseq.class/Page}
:block/created-at now
:block/updated-at now}]
{}
nil])
export-base64 (invoke host port "thread-api/export-db-base64" [repo-a])]
(is (string? export-base64))
(is (pos? (count export-base64)))
(p/let [_ ((:stop! @daemon-a))
{:keys [host port stop!]}
(db-worker-node/start-daemon! {:data-dir data-dir
:repo repo-b})
_ (reset! daemon-b {:stop! stop!})
_ (invoke host port "thread-api/import-db-base64" [repo-b export-base64])
_ (invoke host port "thread-api/create-or-open-db" [repo-b {}])
result (invoke host port "thread-api/q"
[repo-b
['[:find ?e
:in $ ?title
:where [?e :block/title ?title]]
"SQLite Import Page"]])]
(is (seq result))))
(p/catch (fn [e]
(println "[db-worker-node-test] import-sqlite error:" e)
(is false (str e))))
(p/finally (fn []
(let [stop-a (:stop! @daemon-a)
stop-b (:stop! @daemon-b)]
(cond
(and stop-a stop-b)
(-> (stop-a)
(p/finally (fn [] (-> (stop-b) (p/finally (fn [] (done)))))))
stop-a
(-> (stop-a) (p/finally (fn [] (done))))
stop-b
(-> (stop-b) (p/finally (fn [] (done))))
:else
(done)))))))))
(deftest db-worker-node-repo-mismatch-test
(async done
(let [daemon (atom nil)

View File

@@ -3,6 +3,7 @@
[clojure.string :as string]
[logseq.cli.commands :as commands]
[logseq.cli.server :as cli-server]
[logseq.cli.transport :as transport]
[promesa.core :as p]))
(deftest test-help-output
@@ -31,7 +32,9 @@
summary (:summary result)]
(is (true? (:help? result)))
(is (string/includes? summary "graph list"))
(is (string/includes? summary "graph create"))))
(is (string/includes? summary "graph create"))
(is (string/includes? summary "graph export"))
(is (string/includes? summary "graph import"))))
(testing "list group shows subcommands"
(let [result (commands/parse-args ["list"])
@@ -303,6 +306,51 @@
(is (= :show (:command result)))
(is (= "Home" (get-in result [:options :page-name])))))
(testing "graph export parses with type and output"
(let [result (commands/parse-args ["graph" "export"
"--type" "edn"
"--output" "export.edn"])]
(is (true? (:ok? result)))
(is (= :graph-export (:command result)))
(is (= "edn" (get-in result [:options :type])))
(is (= "export.edn" (get-in result [:options :output])))))
(testing "graph import parses with type, input, and repo"
(let [result (commands/parse-args ["graph" "import"
"--type" "sqlite"
"--input" "import.sqlite"
"--repo" "demo"])]
(is (true? (:ok? result)))
(is (= :graph-import (:command result)))
(is (= "sqlite" (get-in result [:options :type])))
(is (= "import.sqlite" (get-in result [:options :input])))
(is (= "demo" (get-in result [:options :repo])))))
(testing "graph export requires type"
(let [result (commands/parse-args ["graph" "export" "--output" "export.edn"])]
(is (false? (:ok? result)))
(is (= :missing-type (get-in result [:error :code])))))
(testing "graph export requires output"
(let [result (commands/parse-args ["graph" "export" "--type" "edn"])]
(is (false? (:ok? result)))
(is (= :missing-output (get-in result [:error :code])))))
(testing "graph import requires repo"
(let [result (commands/parse-args ["graph" "import"
"--type" "edn"
"--input" "import.edn"])]
(is (false? (:ok? result)))
(is (= :missing-repo (get-in result [:error :code])))))
(testing "graph import rejects unknown type"
(let [result (commands/parse-args ["graph" "import"
"--type" "zip"
"--input" "import.zip"
"--repo" "demo"])]
(is (false? (:ok? result)))
(is (= :invalid-options (get-in result [:error :code])))))
(testing "verb subcommands reject unknown flags"
(doseq [args [["list" "page" "--wat"]
["add" "block" "--wat"]
@@ -361,6 +409,22 @@
(is (true? (:ok? result)))
(is (= :graph-info (get-in result [:action :type])))))
(testing "graph export uses config repo"
(let [parsed {:ok? true
:command :graph-export
:options {:type "edn" :output "export.edn"}}
result (commands/build-action parsed {:repo "demo"})]
(is (true? (:ok? result)))
(is (= :graph-export (get-in result [:action :type])))))
(testing "graph import requires repo"
(let [parsed {:ok? true
:command :graph-import
:options {:type "edn" :input "import.edn"}}
result (commands/build-action parsed {})]
(is (false? (:ok? result)))
(is (= :missing-repo (get-in result [:error :code])))))
(testing "list page requires repo"
(let [parsed {:ok? true :command :list-page :options {}}
result (commands/build-action parsed {})]
@@ -419,3 +483,140 @@
(p/catch (fn [e]
(is false (str "unexpected error: " e))
(done)))))))
(deftest test-execute-graph-import-rejects-existing-graph
(async done
(let [orig-list-graphs cli-server/list-graphs
orig-ensure-server! cli-server/ensure-server!]
(set! cli-server/list-graphs (fn [_] ["demo"]))
(set! cli-server/ensure-server! (fn [_ _]
(throw (ex-info "should not start server" {}))))
(-> (p/let [result (commands/execute {:type :graph-import
:repo "logseq_db_demo"
:allow-missing-graph true}
{})]
(is (= :error (:status result)))
(is (= :graph-exists (get-in result [:error :code]))))
(p/catch (fn [e]
(is false (str "unexpected error: " e))))
(p/finally (fn []
(set! cli-server/list-graphs orig-list-graphs)
(set! cli-server/ensure-server! orig-ensure-server!)
(done)))))))
(deftest test-execute-graph-export
(async done
(let [invoke-calls (atom [])
write-calls (atom [])
orig-list-graphs cli-server/list-graphs
orig-ensure-server! cli-server/ensure-server!
orig-invoke transport/invoke
orig-write-output transport/write-output]
(set! cli-server/list-graphs (fn [_] ["demo"]))
(set! cli-server/ensure-server! (fn [config _]
(assoc config :base-url "http://127.0.0.1:9999")))
(set! transport/invoke (fn [_ method direct-pass? args]
(swap! invoke-calls conj [method direct-pass? args])
(if (= method "thread-api/export-db-base64")
"c3FsaXRl"
{:exported true})))
(set! transport/write-output (fn [opts]
(swap! write-calls conj opts)))
(-> (p/let [edn-result (commands/execute {:type :graph-export
:repo "logseq_db_demo"
:graph "demo"
:export-type "edn"
:output "/tmp/export.edn"
:allow-missing-graph true}
{})
sqlite-result (commands/execute {:type :graph-export
:repo "logseq_db_demo"
:graph "demo"
:export-type "sqlite"
:output "/tmp/export.sqlite"
:allow-missing-graph true}
{})]
(is (= :ok (:status edn-result)))
(is (= :ok (:status sqlite-result)))
(is (= [["thread-api/export-edn" false ["logseq_db_demo" {:export-type :graph}]]
["thread-api/export-db-base64" true ["logseq_db_demo"]]]
@invoke-calls))
(is (= 2 (count @write-calls)))
(let [[edn-write sqlite-write] @write-calls]
(is (= {:format :edn :path "/tmp/export.edn" :data {:exported true}}
edn-write))
(is (= :sqlite (:format sqlite-write)))
(is (= "/tmp/export.sqlite" (:path sqlite-write)))
(is (= "sqlite" (.toString (:data sqlite-write) "utf8")))))
(p/catch (fn [e]
(is false (str "unexpected error: " e))))
(p/finally (fn []
(set! cli-server/list-graphs orig-list-graphs)
(set! cli-server/ensure-server! orig-ensure-server!)
(set! transport/invoke orig-invoke)
(set! transport/write-output orig-write-output)
(done)))))))
(deftest test-execute-graph-import
(async done
(let [invoke-calls (atom [])
read-calls (atom [])
stop-calls (atom [])
restart-calls (atom [])
orig-list-graphs cli-server/list-graphs
orig-stop-server! cli-server/stop-server!
orig-restart-server! cli-server/restart-server!
orig-ensure-server! cli-server/ensure-server!
orig-read-input transport/read-input
orig-invoke transport/invoke]
(set! cli-server/list-graphs (fn [_] []))
(set! cli-server/stop-server! (fn [_ repo]
(swap! stop-calls conj repo)
(p/resolved {:ok? true})))
(set! cli-server/restart-server! (fn [_ repo]
(swap! restart-calls conj repo)
(p/resolved {:ok? true})))
(set! cli-server/ensure-server! (fn [config _]
(assoc config :base-url "http://127.0.0.1:9999")))
(set! transport/read-input (fn [{:keys [format path]}]
(swap! read-calls conj [format path])
(if (= format :edn)
{:page "Import Page"}
(js/Buffer.from "sqlite" "utf8"))))
(set! transport/invoke (fn [_ method _ args]
(swap! invoke-calls conj [method args])
{:ok true}))
(-> (p/let [edn-result (commands/execute {:type :graph-import
:repo "logseq_db_demo"
:graph "demo"
:import-type "edn"
:input "/tmp/import.edn"
:allow-missing-graph true}
{})
sqlite-result (commands/execute {:type :graph-import
:repo "logseq_db_demo"
:graph "demo"
:import-type "sqlite"
:input "/tmp/import.sqlite"
:allow-missing-graph true}
{})]
(is (= :ok (:status edn-result)))
(is (= :ok (:status sqlite-result)))
(is (= [[:edn "/tmp/import.edn"]
[:sqlite "/tmp/import.sqlite"]]
@read-calls))
(is (= [["thread-api/import-edn" ["logseq_db_demo" {:page "Import Page"}]]
["thread-api/import-db-base64" ["logseq_db_demo" "c3FsaXRl"]]]
@invoke-calls))
(is (= ["logseq_db_demo" "logseq_db_demo"] @stop-calls))
(is (= ["logseq_db_demo" "logseq_db_demo"] @restart-calls)))
(p/catch (fn [e]
(is false (str "unexpected error: " e))))
(p/finally (fn []
(set! cli-server/list-graphs orig-list-graphs)
(set! cli-server/stop-server! orig-stop-server!)
(set! cli-server/restart-server! orig-restart-server!)
(set! cli-server/ensure-server! orig-ensure-server!)
(set! transport/read-input orig-read-input)
(set! transport/invoke orig-invoke)
(done)))))))


@@ -105,6 +105,23 @@
{:output-format nil})]
(is (= "Removed page: Home (repo: demo-repo)" result)))))
(deftest test-human-output-graph-import-export
(testing "graph export renders a succinct success line"
(let [result (format/format-result {:status :ok
:command :graph-export
:context {:export-type "edn"
:output "/tmp/export.edn"}}
{:output-format nil})]
(is (= "Exported edn to /tmp/export.edn" result))))
(testing "graph import renders a succinct success line"
(let [result (format/format-result {:status :ok
:command :graph-import
:context {:import-type "sqlite"
:input "/tmp/import.sqlite"}}
{:output-format nil})]
(is (= "Imported sqlite from /tmp/import.sqlite" result)))))
(deftest test-human-output-graph-info
(testing "graph info includes key metadata lines"
(let [result (format/format-result {:status :ok


@@ -73,21 +73,21 @@
(-> (p/let [cfg-path (node-path/join (node-helper/create-tmp-dir "cli") "cli.edn")
_ (fs/writeFileSync cfg-path "{:output-format :json}")
_ (run-cli ["graph" "create" "--repo" "content-graph"] data-dir cfg-path)
-add-page-result (run-cli ["add" "page" "--page" "TestPage"] data-dir cfg-path)
+add-page-result (run-cli ["--repo" "content-graph" "add" "page" "--page" "TestPage"] data-dir cfg-path)
add-page-payload (parse-json-output add-page-result)
-list-page-result (run-cli ["list" "page"] data-dir cfg-path)
+list-page-result (run-cli ["--repo" "content-graph" "list" "page"] data-dir cfg-path)
list-page-payload (parse-json-output list-page-result)
-list-tag-result (run-cli ["list" "tag"] data-dir cfg-path)
+list-tag-result (run-cli ["--repo" "content-graph" "list" "tag"] data-dir cfg-path)
list-tag-payload (parse-json-output list-tag-result)
-list-property-result (run-cli ["list" "property"] data-dir cfg-path)
+list-property-result (run-cli ["--repo" "content-graph" "list" "property"] data-dir cfg-path)
list-property-payload (parse-json-output list-property-result)
-add-block-result (run-cli ["add" "block" "--page" "TestPage" "--content" "hello world"] data-dir cfg-path)
+add-block-result (run-cli ["--repo" "content-graph" "add" "block" "--page" "TestPage" "--content" "hello world"] data-dir cfg-path)
_ (parse-json-output add-block-result)
-search-result (run-cli ["search" "--text" "hello world" "--include-content"] data-dir cfg-path)
+search-result (run-cli ["--repo" "content-graph" "search" "--text" "hello world"] data-dir cfg-path)
search-payload (parse-json-output search-result)
-show-result (run-cli ["show" "--page-name" "TestPage" "--format" "json"] data-dir cfg-path)
+show-result (run-cli ["--repo" "content-graph" "show" "--page-name" "TestPage" "--format" "json"] data-dir cfg-path)
show-payload (parse-json-output show-result)
-remove-page-result (run-cli ["remove" "page" "--page" "TestPage"] data-dir cfg-path)
+remove-page-result (run-cli ["--repo" "content-graph" "remove" "page" "--page" "TestPage"] data-dir cfg-path)
remove-page-payload (parse-json-output remove-page-result)
stop-result (run-cli ["server" "stop" "--repo" "content-graph"] data-dir cfg-path)
stop-payload (parse-json-output stop-result)]
@@ -210,3 +210,81 @@
(p/catch (fn [e]
(is false (str "unexpected error: " e))
(done)))))))
(deftest test-cli-graph-export-import-edn
(async done
(let [data-dir (node-helper/create-tmp-dir "db-worker-export-edn")]
(-> (p/let [cfg-path (node-path/join (node-helper/create-tmp-dir "cli") "cli.edn")
_ (fs/writeFileSync cfg-path "{:output-format :json}")
export-graph "export-edn-graph"
import-graph "import-edn-graph"
export-path (node-path/join (node-helper/create-tmp-dir "exports") "graph.edn")
_ (run-cli ["graph" "create" "--repo" export-graph] data-dir cfg-path)
_ (run-cli ["--repo" export-graph "add" "page" "--page" "ExportPage"] data-dir cfg-path)
_ (run-cli ["--repo" export-graph "add" "block" "--page" "ExportPage" "--content" "Export content"] data-dir cfg-path)
export-result (run-cli ["--repo" export-graph
"graph" "export"
"--type" "edn"
"--output" export-path] data-dir cfg-path)
export-payload (parse-json-output export-result)
_ (run-cli ["--repo" import-graph
"graph" "import"
"--type" "edn"
"--input" export-path] data-dir cfg-path)
list-result (run-cli ["--repo" import-graph "list" "page"] data-dir cfg-path)
list-payload (parse-json-output list-result)
stop-export (run-cli ["server" "stop" "--repo" export-graph] data-dir cfg-path)
stop-import (run-cli ["server" "stop" "--repo" import-graph] data-dir cfg-path)]
(is (= 0 (:exit-code export-result)))
(is (= "ok" (:status export-payload)))
(is (fs/existsSync export-path))
(is (pos? (.-size (fs/statSync export-path))))
(is (= "ok" (:status list-payload)))
(is (some (fn [item]
(= "ExportPage" (or (:title item) (:block/title item))))
(get-in list-payload [:data :items])))
(is (= 0 (:exit-code stop-export)))
(is (= 0 (:exit-code stop-import)))
(done))
(p/catch (fn [e]
(is false (str "unexpected error: " e))
(done)))))))
(deftest test-cli-graph-export-import-sqlite
(async done
(let [data-dir (node-helper/create-tmp-dir "db-worker-export-sqlite")]
(-> (p/let [cfg-path (node-path/join (node-helper/create-tmp-dir "cli") "cli.edn")
_ (fs/writeFileSync cfg-path "{:output-format :json}")
export-graph "export-sqlite-graph"
import-graph "import-sqlite-graph"
export-path (node-path/join (node-helper/create-tmp-dir "exports") "graph.sqlite")
_ (run-cli ["graph" "create" "--repo" export-graph] data-dir cfg-path)
_ (run-cli ["--repo" export-graph "add" "page" "--page" "SQLiteExportPage"] data-dir cfg-path)
_ (run-cli ["--repo" export-graph "add" "block" "--page" "SQLiteExportPage" "--content" "SQLite export content"] data-dir cfg-path)
export-result (run-cli ["--repo" export-graph
"graph" "export"
"--type" "sqlite"
"--output" export-path] data-dir cfg-path)
export-payload (parse-json-output export-result)
_ (run-cli ["--repo" import-graph
"graph" "import"
"--type" "sqlite"
"--input" export-path] data-dir cfg-path)
list-result (run-cli ["--repo" import-graph "list" "page"] data-dir cfg-path)
list-payload (parse-json-output list-result)
stop-export (run-cli ["server" "stop" "--repo" export-graph] data-dir cfg-path)
stop-import (run-cli ["server" "stop" "--repo" import-graph] data-dir cfg-path)]
(is (= 0 (:exit-code export-result)))
(is (= "ok" (:status export-payload)))
(is (fs/existsSync export-path))
(is (pos? (.-size (fs/statSync export-path))))
(is (= "ok" (:status list-payload)))
(is (some (fn [item]
(= "SQLiteExportPage" (or (:title item) (:block/title item))))
(get-in list-payload [:data :items])))
(is (= 0 (:exit-code stop-export)))
(is (= 0 (:exit-code stop-import)))
(done))
(p/catch (fn [e]
(is false (str "unexpected error: " e))
(done)))))))


@@ -1,8 +1,17 @@
(ns logseq.cli.transport-test
-(:require [cljs.test :refer [deftest is async]]
+(:require [cljs.test :refer [deftest is async testing]]
[promesa.core :as p]
[logseq.cli.transport :as transport]))
(def ^:private fs (js/require "fs"))
(def ^:private os (js/require "os"))
(def ^:private path (js/require "path"))
(defn- temp-path
[filename]
(let [dir (.mkdtempSync fs (.join path (.tmpdir os) "logseq-cli-"))]
(.join path dir filename)))
(defn- start-server
[handler]
(p/create
@@ -62,3 +71,24 @@
(p/catch (fn [e]
(is false (str "unexpected error: " e))
(done))))))
(deftest test-read-input
(testing "reads edn input"
(let [path (temp-path "input.edn")]
(.writeFileSync fs path "{:a 1}")
(is (= {:a 1} (transport/read-input {:format :edn :path path})))))
(testing "reads sqlite input as buffer"
(let [path (temp-path "input.sqlite")
buffer (js/Buffer.from "sqlite-data")]
(.writeFileSync fs path buffer)
(let [result (transport/read-input {:format :sqlite :path path})]
(is (instance? js/Buffer result))
(is (= "sqlite-data" (.toString result "utf8")))))))
(deftest test-write-output
(testing "writes sqlite output as buffer"
(let [path (temp-path "output.sqlite")
buffer (js/Buffer.from "sqlite-export")]
(transport/write-output {:format :sqlite :path path :data buffer})
(is (= "sqlite-export" (.toString (.readFileSync fs path) "utf8"))))))