CurioQuest is an AI-powered question-answering application that leverages Bing Search for context and Cohere AI for generating detailed, citation-backed responses. Designed for knowledge seekers, CurioQuest provides accurate, source-cited answers to complex questions in real time.
By grounding every response in real-time search results, CurioQuest reduces misinformation and improves the reliability of its answers. It's designed not just for curiosity-driven exploration, but also as a showcase of cutting-edge tech, all while aiming to build something cool enough to get noticed, and maybe even hired!
- Source-cited answers: Provide verifiable answers to user queries, with citations to the relevant sources.
- Real-time information retrieval: Use the Bing Search API to fetch up-to-date information on any topic.
- Engaging and dynamic interactions: Incorporate AI-driven responses for a more interactive and informative experience.
- Intuitive UI/UX: Designed with a clean, minimalistic interface that keeps the focus on content, enabling users to easily ask questions, view responses, and explore citations without distraction.
- Node.js (v14+)
- npm or yarn package manager
- Bing API Key and Cohere API Key
- Clone the repository:
```bash
git clone https://github.com/username/curioquest.git
cd curioquest
```
- Install dependencies:
```bash
npm install
```
- Configure environment variables:
- Create a `.env` file in the root directory:
```
BING_API_KEY=your_bing_api_key
COHERE_API_KEY=your_cohere_api_key
```
- Start the development server:
```bash
npm run dev
```
- Visit `http://localhost:3000` in your browser to use the application.
- Set up a Vercel account and link the project repository.
- Configure the environment variables in Vercel:
  - `BING_API_KEY`
  - `COHERE_API_KEY`
- Deploy the application by following the Vercel dashboard prompts.
- Access the deployed application via the generated Vercel URL.
- Query Input: Users can type in any question to initiate a search.
- Follow-Up Questions: After receiving an initial response, users can ask follow-up questions to refine the context or request additional information.
- Source-Cited Answers: Every response includes references to sources, which can be viewed by hovering over the citations.
- Ask a Question:
- User: "Who is Elon Musk?"
- CurioQuest: Returns a detailed answer with citations from Bing Search.
- Follow-Up Questions:
- User: "What are his companies?"
- CurioQuest: Uses the previous context to provide an updated answer with sources.
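To make the follow-up flow above concrete, here is a rough sketch of how a client could thread prior turns into the next request. The route name `/api/answer` and the field names are purely illustrative assumptions, not CurioQuest's actual API:

```typescript
// Illustrative request/response shapes for a follow-up question.
// Route name and fields are hypothetical, shown only to convey the idea
// that earlier turns travel along as context.
interface Turn {
  question: string;
  answer: string;
}

interface AnswerResponse {
  answer: string;                              // citation-annotated answer text
  sources: { title: string; url: string }[];   // search results backing the answer
}

async function askFollowUp(question: string, history: Turn[]): Promise<AnswerResponse> {
  const res = await fetch("/api/answer", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ question, history }), // prior turns refine the context
  });
  if (!res.ok) throw new Error(`Request failed with status ${res.status}`);
  return res.json();
}

// Example: askFollowUp("What are his companies?",
//   [{ question: "Who is Elon Musk?", answer: "..." }]);
```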
When selecting a search API, we evaluated major options like Google, X, and Bing. After weighing our choices, Bing emerged as the clear winner. Here’s why:
- Trustworthy Source: Platforms like Perplexity also rely on Bing, indicating it’s a trusted source for accurate, structured, and real-time data.
- Seamless Integration: Bing’s API integrates smoothly, enabling us to deliver fast, reliable answers without technical bottlenecks (see the sketch after the research links below).
- Reliable and Up-to-Date: Bing excels at providing the latest information, aligning perfectly with our goal of offering timely, verifiable responses.
For those interested in our research:
- Fireside Chat with Aravind Srinivas, CEO of Perplexity AI, & Matt Turck, Partner at FirstMark
- Reddit on Perplexity’s Backend
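To illustrate the integration point above, here is a minimal sketch of a server-side Bing Web Search call. The v7 endpoint and subscription-key header follow Microsoft's public documentation; the helper name and the trimmed result shape are our own assumptions:

```typescript
// Minimal server-side Bing Web Search call (sketch).
// Endpoint and header follow the public Bing Web Search API v7 docs;
// the surrounding helper is illustrative, not CurioQuest's actual code.
interface SearchResult {
  name: string;    // page title
  url: string;     // page URL (used for citations)
  snippet: string; // short excerpt fed to the AI as context
}

async function searchBing(query: string): Promise<SearchResult[]> {
  const endpoint = "https://api.bing.microsoft.com/v7.0/search";
  const res = await fetch(`${endpoint}?q=${encodeURIComponent(query)}&count=5`, {
    headers: { "Ocp-Apim-Subscription-Key": process.env.BING_API_KEY! },
  });
  if (!res.ok) throw new Error(`Bing Search failed: ${res.status}`);
  const data = await res.json();
  // webPages.value holds the organic web results in the v7 response.
  return (data.webPages?.value ?? []).map((p: any) => ({
    name: p.name,
    url: p.url,
    snippet: p.snippet,
  }));
}
```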
The decision to use Cohere AI for generating responses was driven by both practicality and budget optimization. While premium models like GPT were appealing, Cohere offered a cost-effective solution that still delivers high-quality, contextually accurate answers.
Maybe someday, if I land a role with a bigger budget, we can go “broke-free” and integrate the premium models—but until then, Cohere is a solid choice! 😅
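For completeness, here is a rough sketch of how the gathered search context could be handed to Cohere for answer generation. The endpoint, model name, and payload fields are assumptions based on Cohere's public REST API and may differ from the current version; check Cohere's API reference before reusing this:

```typescript
// Sketch of generating a cited answer with Cohere's REST chat endpoint.
// Endpoint, model name, and payload fields are assumptions; consult
// Cohere's current API reference before relying on them.
async function generateAnswer(question: string, context: string): Promise<string> {
  const res = await fetch("https://api.cohere.ai/v1/chat", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${process.env.COHERE_API_KEY!}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      model: "command-r", // assumed model name
      message:
        `Answer the question using only the context below and cite your sources.\n\n` +
        `Context:\n${context}\n\nQuestion: ${question}`,
    }),
  });
  if (!res.ok) throw new Error(`Cohere request failed: ${res.status}`);
  const data = await res.json();
  return data.text; // the chat response carries the generated text in `text`
}
```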
The UI is designed with a minimalist layout to enhance content focus, and citation bubbles offer quick access to sources for easy reference.
- Time Crunch and UI Focus: With limited time, Next.js was ideal for building a sleek, responsive UI that offers seamless functionality. Its SSR and file-based routing simplified a lot of front-end work.
- Perfect Fit for MVP: As an MVP with minimal backend requirements, Next.js made sense as a front-end-first framework that’s easy to scale if needed.
- New Territory: Initially, I was a bit skeptical since I hadn’t worked with Next.js extensively. However, its developer-friendly setup and extensive ecosystem made it manageable and a solid choice for tight deadlines and feature-rich UIs.
- Due to serverless function limitations on the free tier we deploy to, we opted to separate the context-gathering and AI-generation functionality into distinct API routes. This keeps call frequency manageable within free-tier constraints and gives us flexibility, since demand for context retrieval and for AI-driven responses can scale independently as the project grows. A sketch of this split follows.
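A minimal sketch of that split, assuming the pages/api routing convention and hypothetical helper modules (`lib/bing.ts`, `lib/cohere.ts`) wrapping the calls sketched earlier:

```typescript
// pages/api/context.ts — context gathering only (wraps the Bing Search helper).
// File path and helper module are illustrative assumptions.
import type { NextApiRequest, NextApiResponse } from "next";
import { searchBing } from "../../lib/bing"; // hypothetical module (see Bing sketch above)

export default async function handler(req: NextApiRequest, res: NextApiResponse) {
  const { query } = req.body;
  const sources = await searchBing(query);
  res.status(200).json({ sources });
}
```

```typescript
// pages/api/generate.ts — answer generation only (wraps the Cohere helper).
// File path and helper module are illustrative assumptions.
import type { NextApiRequest, NextApiResponse } from "next";
import { generateAnswer } from "../../lib/cohere"; // hypothetical module (see Cohere sketch above)

export default async function handler(req: NextApiRequest, res: NextApiResponse) {
  const { question, context } = req.body;
  const answer = await generateAnswer(question, context);
  res.status(200).json({ answer });
}
```

Keeping the routes separate lets the front end call each one independently, which is what allows us to stay within per-call limits on the free tier.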
For more info, feel free to 📘 visit the Notion Doc.