Commit bfd544e

Support gpt-3.5-turbo (#23)
Co-authored-by: Fredrick R. Brennan <copypaste@kittens.ph>
1 parent 20eaff0 commit bfd544e

6 files changed: 245 additions & 64 deletions

README.md

Lines changed: 9 additions & 33 deletions

````diff
@@ -1,6 +1,6 @@
 <h1 align="center"><code>ata</code>: Ask the Terminal Anything</h1>
 
-<h3 align="center">OpenAI GPT in the terminal</h3>
+<h3 align="center">ChatGPT in the terminal</h3>
 
 <p align="center">
 <a href="https://asciinema.org/a/557270"><img src="https://asciinema.org/a/557270.svg" alt="asciicast"></a>
@@ -12,8 +12,6 @@ TIP:<br>
 This can be done via: Iterm2 (Mac), Guake (Ubuntu), scratchpad (i3/sway), or the quake mode for the Windows Terminal.
 </h3>
 
-At the time of writing, use `text-davinci-003`. Davinci was released together with ChatGPT as part of the [GPT-3.5 series](https://platform.openai.com/docs/model-index-for-researchers/models-referred-to-as-gpt-3-5) and they are very comparable in terms of capabilities; ChatGPT is more verbose.
-
 ## Productivity benefits
 
 - The terminal starts more quickly and requires **less resources** than a browser.
@@ -26,35 +24,11 @@ At the time of writing, use `text-davinci-003`. Davinci was released together wi
 Download the binary for your system from [Releases](https://github.com/rikhuijzer/ata/releases).
 If you're running Arch Linux, then you can use the AUR packages: [ata](https://aur.archlinux.org/packages/ata), [ata-git](https://aur.archlinux.org/packages/ata-git), or [ata-bin](https://aur.archlinux.org/packages/ata-bin).
 
-Request an API key via <https://beta.openai.com/account/api-keys>.
-Next, set the API key, the model that you want to use, and the maximum amount of tokens that the server can respond with in `ata.toml`:
-
-```toml
-api_key = "<YOUR SECRET API KEY>"
-model = "text-davinci-003"
-max_tokens = 500
-temperature = 0.8
-```
-
-Here, replace `<YOUR SECRET API KEY>` with your API key, which you can request via https://beta.openai.com/account/api-keys.
-
-The `max_tokens` sets the maximum amount of tokens that the server will answer with.
-
-The `temperature` sets the `sampling temperature`. From the OpenAI API docs: "What sampling temperature to use. Higher values means the model will take more risks. Try 0.9 for more creative applications, and 0 (argmax sampling) for ones with a well-defined answer." According to Stephen Wolfram [[1]], setting it to a higher value such as 0.8 will likely work best in practice.
-
-[1]: https://writings.stephenwolfram.com/2023/02/what-is-chatgpt-doing-and-why-does-it-work/
+To specify the API key and some basic model settings, start the application.
+It should give an error and the option to create a configuration file called `ata.toml` for you.
+Press `y` and `ENTER` to create a `ata.toml` file.
 
-Next, run:
-
-```sh
-$ ata --config=ata.toml
-```
-
-Or, change the current directory to the one where `ata.toml` is located and run
-
-```sh
-$ ata
-```
+Next, request an API key via <https://beta.openai.com/account/api-keys> and update the key in the example configuration file.
 
 For more information, see:
 
@@ -66,8 +40,10 @@ $ ata --help
 
 **How much will I have to pay for the API?**
 
-Using OpenAI's API is quite cheap, I have been using this terminal application heavily for a few weeks now and my costs are about $0.15 per day ($4.50 per month).
-The first $18.00 for free, so you can use it for about 120 days (4 months) before having to pay.
+Using OpenAI's API for chat is very cheap.
+Let's say that an average response is about 500 tokens, so costs $0.001.
+That means that if you do 100 requests per day, then that will cost you about $0.10.
+OpenAI grants you $18.00 for free, so you can use the API for about 180 days (6 months) before having to pay.
 
 **Can I build the binary myself?**
 
````
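The FAQ's new cost estimate can be checked with a few lines of arithmetic. This is a minimal sketch; the per-1K-token price is inferred from the diff's own figures ("500 tokens, so costs $0.001", i.e. $0.002 per 1K tokens) rather than taken from OpenAI's price list, which may change.

```rust
// Cost arithmetic behind the FAQ answer above. The price per 1K tokens is
// an assumption implied by the README's own figures, not an official rate.
fn main() {
    let price_per_1k_tokens = 0.002_f64; // USD, implied by "500 tokens costs $0.001"
    let tokens_per_response = 500.0;
    let requests_per_day = 100.0;
    let free_credit = 18.00; // USD of free credit mentioned in the README

    let cost_per_response = tokens_per_response / 1000.0 * price_per_1k_tokens;
    let cost_per_day = cost_per_response * requests_per_day;
    let free_days = free_credit / cost_per_day;

    println!("per response: ${cost_per_response:.3}"); // 0.001
    println!("per day:      ${cost_per_day:.2}");      // 0.10
    println!("free days:    {free_days:.0}");          // 180
}
```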

ata/Cargo.toml

Lines changed: 5 additions & 3 deletions

````diff
@@ -1,19 +1,21 @@
 [package]
 name = "ata"
-version = "1.0.3"
+version = "2.0.0"
 edition = "2021"
 authors = ["Rik Huijzer <t.h.huijzer@rug.nl>", "Fredrick R. Brennan <copypaste@kittens.ph>"]
 license = "MIT"
 
 [dependencies]
-rustyline = "10.1"
+clap = { version = "4.1.4", features = ["derive"] }
+directories = "4.0"
 hyper = { version = "0.14", features = ["full"] }
 hyper-rustls = { version = "0.23" }
+os_str_bytes = "6.4.1"
+rustyline = "10.1"
 serde = { version = "1", features = ["derive"] }
 serde_json = { version = "1" }
 tokio = { version = "1", features = ["full"] }
 toml = { version = "0.6" }
-clap = { version = "4.1.4", features = ["derive"] }
 
 [dev-dependencies]
 pretty_assertions = "1"
````

ata/src/config.rs

Lines changed: 114 additions & 0 deletions

````diff
@@ -0,0 +1,114 @@
+use directories::ProjectDirs;
+use os_str_bytes::OsStrBytes as _;
+use os_str_bytes::OsStringBytes as _;
+use serde::Deserialize;
+use std::convert::Infallible;
+use std::ffi::OsString;
+use std::path::Path;
+use std::path::PathBuf;
+use std::str::FromStr;
+use toml::de::Error as TomlError;
+
+#[derive(Clone, Deserialize, Debug)]
+pub struct Config {
+    pub api_key: String,
+    pub model: String,
+    pub max_tokens: i64,
+    pub temperature: f64
+}
+
+#[derive(Clone, Deserialize, Debug, Default)]
+pub enum ConfigLocation {
+    #[default]
+    Auto,
+    Path(PathBuf),
+    Named(PathBuf),
+}
+
+impl FromStr for ConfigLocation {
+    type Err = Infallible;
+
+    fn from_str(s: &str) -> Result<Self, Self::Err> {
+        Ok(if !s.contains('.') && !s.is_empty() {
+            Self::Named(s.into())
+        } else if !s.is_empty() {
+            Self::Path(s.into())
+        } else if s.trim().is_empty() {
+            Self::Auto
+        } else {
+            unreachable!()
+        })
+    }
+}
+
+impl<S> From<S> for ConfigLocation where S: AsRef<str> {
+    fn from(s: S) -> Self {
+        Self::from_str(s.as_ref()).unwrap()
+    }
+}
+
+fn get_config_dir() -> PathBuf {
+    ProjectDirs::from(
+        "ata",
+        "Ask the Terminal Anything (ATA) Project Authors",
+        "ata",
+    ).unwrap().config_dir().into()
+}
+
+pub fn default_path(name: Option<&Path>) -> PathBuf {
+    let mut config_file = get_config_dir();
+    let file: Vec<_> = if let Some(name) = name {
+        let mut name = name.to_path_buf();
+        name.set_extension("toml");
+        name.as_os_str()
+            .to_raw_bytes()
+            .iter()
+            .copied()
+            .collect()
+    } else {
+        "ata.toml".bytes().collect()
+    };
+    let file = OsString::assert_from_raw_vec(file);
+    config_file.push(&file);
+    config_file
+}
+
+impl ConfigLocation {
+    pub fn location(&self) -> PathBuf {
+        match self {
+            ConfigLocation::Auto => {
+                if self.location().exists() {
+                    return self.location();
+                }
+                default_path(None)
+            }
+            ConfigLocation::Path(pb) => {
+                pb.clone()
+            },
+            ConfigLocation::Named(name) => {
+                if name.as_os_str() == "default" {
+                    return match Path::new("ata.toml").exists() {
+                        true => Path::new(&"ata.toml").into(),
+                        false => default_path(None)
+                    };
+                }
+                default_path(Some(name))
+            }
+        }
+    }
+}
+
+impl FromStr for Config {
+    type Err = TomlError;
+
+    fn from_str(contents: &str) -> Result<Self, Self::Err> {
+        toml::from_str(contents)
+    }
+}
+
+impl<S> From<S> for Config where S: AsRef<str> {
+    fn from(s: S) -> Self {
+        Self::from_str(s.as_ref())
+            .unwrap_or_else(|e| panic!("Config parsing failure!: {:?}", e))
+    }
+}
````
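The `ConfigLocation` parser above routes a `--config` value to one of three lookup strategies: a bare name (no `.`) selects a named profile that `default_path` later resolves to `<config dir>/<name>.toml`, a value containing a `.` is taken as a literal path, and an empty value means automatic lookup. A minimal standalone restatement of those classification rules (the profile name `work` is a hypothetical example, not from the commit):

```rust
use std::convert::Infallible;
use std::path::PathBuf;
use std::str::FromStr;

// Standalone sketch of ConfigLocation::from_str from the diff above:
// bare name -> Named, contains '.' -> Path, empty -> Auto.
#[derive(Debug, PartialEq)]
enum ConfigLocation {
    Auto,
    Path(PathBuf),
    Named(PathBuf),
}

impl FromStr for ConfigLocation {
    type Err = Infallible;

    fn from_str(s: &str) -> Result<Self, Self::Err> {
        Ok(if !s.contains('.') && !s.is_empty() {
            Self::Named(s.into())
        } else if !s.is_empty() {
            Self::Path(s.into())
        } else {
            Self::Auto
        })
    }
}

fn main() {
    // A hypothetical profile name resolves to a named profile ...
    assert_eq!(
        "work".parse::<ConfigLocation>(),
        Ok(ConfigLocation::Named("work".into()))
    );
    // ... a value with an extension is taken as a literal path ...
    assert_eq!(
        "./ata.toml".parse::<ConfigLocation>(),
        Ok(ConfigLocation::Path("./ata.toml".into()))
    );
    // ... and an empty value falls back to automatic lookup.
    assert_eq!("".parse::<ConfigLocation>(), Ok(ConfigLocation::Auto));
}
```

Note that in the committed code the parse is infallible by construction, which is why the `From<S>` impl can safely `unwrap()`.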

ata/src/help.rs

Lines changed: 46 additions & 8 deletions

````diff
@@ -1,3 +1,9 @@
+use crate::config;
+use rustyline::Editor;
+use std::fs::File;
+use std::fs;
+use std::io::Write as _;
+
 pub fn commands() {
     println!("
 Ctrl-A, Home         Move cursor to the beginning of line
@@ -29,29 +35,61 @@ Thanks to <https://github.com/kkawakam/rustyline#emacs-mode-default-mode>.
 ");
 }
 
+const EXAMPLE_TOML: &str = r#"api_key = "<YOUR SECRET API KEY>"
+model = "gpt-3.5-turbo"
+max_tokens = 1000
+temperature = 0.8"#;
+
 pub fn missing_toml(args: Vec<String>) {
+    let default_path = config::default_path(None);
     eprintln!(
         r#"
-Could not find the file `ata.toml`. To fix this, use `{} --config=<Path to ata.toml>` or have `ata.toml` in the current dir.
+Could not find a configuration file.
 
-For example, make a new file `ata.toml` in the current directory with the following content (the text between the ```):
+To fix this, use `{} --config=<Path to ata.toml>` or create `{1}`. For the last option, type `y` to write the following example file:
 
 ```
-api_key = "<YOUR SECRET API KEY>"
-model = "text-davinci-003"
-max_tokens = 500
-temperature = 0.8
+{EXAMPLE_TOML}
 ```
 
-Here, replace `<YOUR SECRET API KEY>` with your API key, which you can request via https://beta.openai.com/account/api-keys.
+Next, replace `<YOUR SECRET API KEY>` with your API key, which you can request via https://beta.openai.com/account/api-keys.
 
 The `max_tokens` sets the maximum amount of tokens that the server will answer with.
 
 The `temperature` sets the `sampling temperature`. From the OpenAI API docs: "What sampling temperature to use. Higher values means the model will take more risks. Try 0.9 for more creative applications, and 0 (argmax sampling) for ones with a well-defined answer." According to Stephen Wolfram [1], setting it to a higher value such as 0.8 will likely work best in practice.
 
 [1]: https://writings.stephenwolfram.com/2023/02/what-is-chatgpt-doing-and-why-does-it-work/
+"#,
+        args[0],
+        default_path.display()
+    );
+
+    let mut rl = Editor::<()>::new().unwrap();
+    let msg = format!(
+        "\x1b[1mDo you want me to write this example file to {0:?} for you to edit? [y/N]\x1b[0m",
+        default_path
+    );
+    let readline = rl.readline(&msg);
+    if let Ok(msg) = readline {
+        let response: bool = msg
+            .trim()
+            .chars()
+            .next()
+            .map(|c| c.to_lowercase().collect::<String>() == "y")
+            .unwrap_or(false);
+        if response {
+            if !default_path.exists() && !default_path.parent().unwrap().is_dir() {
+                let dir = default_path.parent().unwrap();
+                fs::create_dir_all(dir).expect("Could not make configuration directory");
+            }
+            let mut f = File::create(&default_path).expect("Unable to create file");
+            f.write_all(EXAMPLE_TOML.as_bytes())
+                .expect("Unable to write to file");
+            println!();
+            println!("Wrote to {default_path:?}.");
+        }
+    }
 
-"#, args[0]);
     std::process::exit(1);
 }
 
````
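The interactive confirmation added to `missing_toml` accepts any reply whose first character is `y` or `Y`; everything else, including an empty reply, is treated as a refusal. A standalone restatement of just that check, directly following the logic in the diff:

```rust
// Standalone restatement of the y/N check added to missing_toml above:
// only the first character of the trimmed reply matters, case-insensitively,
// and anything else (including an empty reply) counts as "no".
fn is_yes(reply: &str) -> bool {
    reply
        .trim()
        .chars()
        .next()
        .map(|c| c.to_lowercase().collect::<String>() == "y")
        .unwrap_or(false)
}

fn main() {
    assert!(is_yes("y"));
    assert!(is_yes("Yes please"));
    assert!(!is_yes(""));  // empty reply keeps the default "N"
    assert!(!is_yes("n"));
}
```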
