85 changes: 84 additions & 1 deletion Cargo.lock


2 changes: 2 additions & 0 deletions Cargo.toml
@@ -27,6 +27,8 @@ rand = "0.8"
sha2 = "0.10"
tiny_http = "0.12"
comfy-table = "7"
Comment thread
pthurlow marked this conversation as resolved.
The `nix` crate is only used inside a `#[cfg(target_os = "macos")]` block (`stdin_redirect_filename`), but it is declared as an unconditional dependency. This means it compiles and links on Linux and Windows even though it is never called there.

Consider a platform-scoped dependency:

Suggested change

```toml
comfy-table = "7"
nix = { version = "0.29", features = ["fs"], optional = true }
```

and activate it only for macOS in `[target.'cfg(target_os = "macos")'.dependencies]`, or use a target-specific dependency table directly:

```toml
[target.'cfg(target_os = "macos")'.dependencies]
nix = { version = "0.29", features = ["fs"] }
```
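The cfg-gated call site would then look something like this (a minimal sketch; the real `stdin_redirect_filename` body in the PR presumably calls into `nix` and is not shown here):

```rust
// Hypothetical shape of the macOS-only function the reviewer refers to.
// On macOS the real code would call into `nix`; elsewhere a stub keeps
// the rest of the codebase platform-agnostic.
#[cfg(target_os = "macos")]
fn stdin_redirect_filename() -> Option<String> {
    None // placeholder: the actual implementation would use `nix` here
}

#[cfg(not(target_os = "macos"))]
fn stdin_redirect_filename() -> Option<String> {
    None
}

fn main() {
    // Compiles on every platform; only the macOS build would link `nix`.
    println!("{:?}", stdin_redirect_filename());
}
```

Note that with a target-specific dependency table, any `use nix::...` imports must also sit under the same `#[cfg]`, or non-macOS builds will fail to resolve the crate.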

indicatif = "0.17"
nix = { version = "0.29", features = ["fs"] }
flate2 = "1"
tar = "0.4"
semver = "1"
112 changes: 25 additions & 87 deletions src/command.rs
@@ -16,8 +16,19 @@ pub enum Commands {

/// Manage datasets
Datasets {
/// Dataset ID to show details
id: Option<String>,

/// Workspace ID (defaults to first workspace from login)
#[arg(long)]
workspace_id: Option<String>,

/// Output format (used with dataset ID)
#[arg(long, default_value = "table", value_parser = ["table", "json", "yaml"])]
format: String,

#[command(subcommand)]
command: DatasetsCommands,
command: Option<DatasetsCommands>,
},
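Because `command` is now `Option<DatasetsCommands>` and `id` is an optional positional, the handler has to disambiguate three cases. A minimal sketch of that dispatch (hypothetical function and variant names, not the PR's actual handler):

```rust
// Stand-in for the clap-derived subcommand enum above.
#[derive(Debug, PartialEq)]
enum DatasetsCommands {
    List,
}

// Dispatch: an explicit subcommand wins; otherwise a bare ID shows details,
// and no arguments at all falls back to listing datasets.
fn dispatch(id: Option<String>, command: Option<DatasetsCommands>) -> String {
    match (command, id) {
        (Some(sub), _) => format!("run subcommand {:?}", sub),
        (None, Some(id)) => format!("show dataset {}", id),
        (None, None) => "list datasets".to_string(),
    }
}

fn main() {
    assert_eq!(dispatch(None, None), "list datasets");
    assert_eq!(dispatch(Some("ds-42".into()), None), "show dataset ds-42");
    println!("{}", dispatch(None, Some(DatasetsCommands::List)));
}
```

One caveat with this pattern: clap resolves a leading token as a subcommand before treating it as a positional, so a dataset whose ID happens to equal a subcommand name would likely need to be escaped (e.g. with `--`).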

/// Execute a SQL query
@@ -159,111 +170,38 @@ pub enum DatasetsCommands {
#[arg(long)]
workspace_id: Option<String>,

/// Output format
#[arg(long, default_value = "yaml", value_parser = ["table", "json", "yaml"])]
format: String,
},

/// Get details for a specific dataset
Get {
/// Workspace ID (defaults to first workspace from login)
#[arg(long)]
workspace_id: Option<String>,

/// Dataset ID
dataset_id: String,

/// Output format
#[arg(long, default_value = "yaml", value_parser = ["table", "json", "yaml"])]
format: String,
},

/// Create a new dataset in a workspace
Create {
/// Workspace ID (defaults to first workspace from login)
#[arg(long)]
workspace_id: Option<String>,

/// Dataset name
#[arg(long)]
name: String,

/// SQL query for the dataset
/// Maximum number of results (default: 100, max: 1000)
#[arg(long)]
sql: Option<String>,
limit: Option<u32>,

/// Connection ID for the dataset
/// Pagination offset
#[arg(long)]
connection_id: Option<String>,
offset: Option<u32>,

/// Output format
#[arg(long, default_value = "yaml", value_parser = ["table", "json", "yaml"])]
#[arg(long, default_value = "table", value_parser = ["table", "json", "yaml"])]
format: String,
},
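The help text documents a default of 100 and a cap of 1000 for `--limit`. Wherever that clamp is enforced (the diff does not show whether it is client- or server-side), it reduces to something like:

```rust
// Hypothetical clamp matching the documented bounds (default 100, max 1000).
fn effective_limit(limit: Option<u32>) -> u32 {
    limit.unwrap_or(100).min(1000)
}

fn main() {
    assert_eq!(effective_limit(None), 100);        // documented default
    assert_eq!(effective_limit(Some(5000)), 1000); // capped at the max
    println!("{}", effective_limit(Some(250)));
}
```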

/// Update a dataset in a workspace
Update {
/// Create a new dataset from a file or piped stdin
Create {
/// Workspace ID (defaults to first workspace from login)
#[arg(long)]
workspace_id: Option<String>,

/// Dataset ID
dataset_id: String,

/// New dataset name
/// Dataset label (derived from filename if omitted)
#[arg(long)]
name: Option<String>,
label: Option<String>,

/// New SQL query for the dataset
/// Table name (derived from label if omitted)
#[arg(long)]
query: Option<String>,

/// Output format
#[arg(long, default_value = "yaml", value_parser = ["table", "json", "yaml"])]
format: String,
},
table_name: Option<String>,

/// Delete a dataset from a workspace
Delete {
/// Workspace ID (defaults to first workspace from login)
/// Path to a file to upload (omit to read from stdin)
#[arg(long)]
workspace_id: Option<String>,

/// Dataset ID
dataset_id: String,
file: Option<String>,
},
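The `Create` variant's doc comments describe reading either `--file` or piped stdin. A plausible sketch of that input selection (the helper name is an assumption, not the PR's code):

```rust
use std::io::Read;

// Read the upload body from --file when given, otherwise from stdin,
// mirroring the "file or piped stdin" behavior the doc comments describe.
fn read_input(file: Option<&str>) -> std::io::Result<Vec<u8>> {
    match file {
        Some(path) => std::fs::read(path),
        None => {
            let mut buf = Vec::new();
            std::io::stdin().read_to_end(&mut buf)?;
            Ok(buf)
        }
    }
}

fn main() -> std::io::Result<()> {
    // Exercise only the file branch so the example doesn't block on stdin.
    let path = std::env::temp_dir().join("ds_create_example.csv");
    std::fs::write(&path, b"a,b\n1,2\n")?;
    let bytes = read_input(path.to_str())?;
    assert_eq!(bytes.len(), 8);
    println!("read {} bytes", bytes.len());
    Ok(())
}
```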

/// Update the SQL query for a dataset
UpdateSql {
/// Workspace ID (defaults to first workspace from login)
#[arg(long)]
workspace_id: Option<String>,

/// Dataset ID
dataset_id: String,

/// New SQL query for the dataset
#[arg(long)]
sql: String,

/// Output format
#[arg(long, default_value = "yaml", value_parser = ["table", "json", "yaml"])]
format: String,
},

/// Execute a dataset
Execute {
/// Workspace ID (defaults to first workspace from login)
#[arg(long)]
workspace_id: Option<String>,

/// Dataset ID
dataset_id: String,

/// Output format
#[arg(long, default_value = "yaml", value_parser = ["table", "json", "yaml"])]
format: String,
},
}

