v1.88.7
What's Changed
You can now drive the browser through multiple steps toward a goal; see the example below. Extracting content or gathering extra data along the way can be done with GPTConfigs.extra_ai_data, and the OpenAI credits used can be checked with Page.openai_credits_used (a usage sketch follows the example).
- chore(page): return all page content regardless of status
- chore(openai): fix svg removal
- feat(openai): add extra data gpt curating
- chore(openai): add credits used response
- feat(fingerprint): add fingerprint id configuration
use spider::configuration::GPTConfigs;
use spider::website::Website;

#[tokio::main]
async fn main() {
    // Drive the browser through multiple steps toward a goal.
    let gpt_config: GPTConfigs = GPTConfigs::new_multi(
        "gpt-4-1106-preview",
        vec![
            "Search for Movies",
            "Click on the first movie result",
        ],
        500,
    );

    let mut website: Website = Website::new("https://www.google.com")
        .with_openai(Some(gpt_config))
        .with_limit(1)
        .build()
        .unwrap();

    website.crawl().await;
}
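Below is a minimal sketch of enabling extra data curation and reading back the credits used afterwards. The extra_ai_data flag on GPTConfigs and the openai_credits_used field on Page come from this release; the scrape/get_pages retrieval flow and the exact field shapes are assumptions, so consult the crate docs for the current signatures.

use spider::configuration::GPTConfigs;
use spider::website::Website;

#[tokio::main]
async fn main() {
    // Assumption: extra_ai_data is a settable flag on GPTConfigs that asks the
    // model to curate extra data from each page it visits.
    let mut gpt_config: GPTConfigs =
        GPTConfigs::new("gpt-4-1106-preview", "Extract the page title and main topic", 500);
    gpt_config.extra_ai_data = true;

    let mut website: Website = Website::new("https://www.google.com")
        .with_openai(Some(gpt_config))
        .with_limit(1)
        .build()
        .unwrap();

    // Assumed workflow: scrape() stores pages so they can be inspected afterwards.
    website.scrape().await;

    if let Some(pages) = website.get_pages() {
        for page in pages.iter() {
            // openai_credits_used is the field named in this release; its exact
            // type is assumed to be Debug-printable here.
            println!("{} -> {:?}", page.get_url(), page.openai_credits_used);
        }
    }
}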
Full Changelog: v1.87.3...v1.88.7