AI-driven app with smart navigation, interactive chatbot, insightful videos, and personalized assessments.
Report Bug · Request Feature
The Learning App is designed to provide a personalized educational experience. Users can input links to resources they want to learn from, and the LLM will update its knowledge base accordingly. The app offers various features, including reading generated pages, learning through video lectures, taking tests, and chatting with the AI.
To run the project locally, follow the steps below.
- Clone the repository:

  ```sh
  git clone https://github.com/Sar2580P/cleverchat.git
  cd cleverchat
  ```
- Install dependencies for the frontend (Next.js):

  ```sh
  cd web
  npm install
  ```
- Install dependencies for the backend (Django):

  ```sh
  cd ../api
  python3 -m venv .venv
  source .venv/bin/activate
  pip install -r requirements.txt
  ```
- Run the development servers:

  - For Next.js (frontend):

    ```sh
    cd web
    npm run build
    npm start
    ```

  - For Django (backend):

    ```sh
    cd ../api
    python manage.py runserver
    ```
- Open your browser and navigate to `http://localhost:3000` for the frontend and `http://127.0.0.1:8000/api` for the backend API.
- Non-Interactive Bots: Many service bots today respond only in plain text, which makes interactions feel less engaging.
- Upper Bound on Agent Tools: Having too many tools available can overwhelm agents and degrade their performance.
- Reading Is Tough: Users often prefer engaging formats, such as videos or interactive media, over lengthy text-based content like blogs.
- Maintenance of Tools: Agents are given a static list of tools for every query, which can lead to inefficiencies when the tools are not updated or optimized for the query's context.
- Unnecessarily Increasing Context Length: Expanding the context length without need hurts the system's performance and efficiency.
- Each query is itself a directed acyclic graph (DAG).
- Certain parts of the query contribute to answering other parts.
- This idea allows segmenting the query into directed sub-tasks.
- Each sub-task is solved in topological order, and its response is enriched with metadata such as associated images and web links.
- The final response is served as a concatenation of the enriched responses from each node in the graph.
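The steps above can be sketched with Python's standard-library `graphlib`. Here `solve` is a hypothetical stand-in for whatever LLM call actually answers a sub-task and enriches it with metadata; it is not the app's real API:

```python
from graphlib import TopologicalSorter

def solve_query_dag(dag, solve):
    """Solve each sub-task of a query DAG in topological order.

    dag   -- dict mapping each sub-task node to the set of nodes it depends on
    solve -- hypothetical callable: (node, parent_responses) -> enriched response
    """
    responses = {}
    for node in TopologicalSorter(dag).static_order():
        # Parent responses are available before a child is solved.
        parents = {p: responses[p] for p in dag.get(node, set())}
        responses[node] = solve(node, parents)
    # Final answer: concatenation of the enriched responses, in order.
    return "\n\n".join(responses[n] for n in TopologicalSorter(dag).static_order())

# Example: "compare X and Y" depends on the two "describe" sub-tasks.
dag = {
    "describe X": set(),
    "describe Y": set(),
    "compare X and Y": {"describe X", "describe Y"},
}
answer = solve_query_dag(dag, lambda node, parents: f"[{node}]")
```

The topological order guarantees that the comparison sub-task is only solved once both descriptions are available.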
- Each agent has a set of agents and tools available to it, similar to how a prime minister has a set of ministers under them.
- This hierarchical structure helps manage tasks efficiently.
- Each agent has a module for creating a directed acyclic graph.
- The DAG module establishes the relationship between the input/output of tools and agents.
- After exploring connectivity using the DAG module, each tool is run in a topological manner.
- The output of a parent tool node is available to children as context in their prompt.
- Each node in the graph is either a tool or another agent itself.
- Each node handles a distinct part of the original query and returns a response along with metadata (image links, etc.).
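A minimal sketch of that topological run, assuming a hypothetical `run_node` that stands in for invoking a tool or sub-agent; the point is only how a parent's output becomes context in its children's prompts:

```python
from graphlib import TopologicalSorter

def run_node(name, prompt):
    # Placeholder for invoking a tool or sub-agent (hypothetical).
    return f"<{name} answered: {prompt!r}>"

def run_agent(dag, query):
    """Run each tool/agent node topologically; a parent's output is
    injected as context into the prompts of its children."""
    outputs = {}
    for node in TopologicalSorter(dag).static_order():
        context = "\n".join(outputs[p] for p in dag.get(node, ()))
        prompt = f"{query}\nContext:\n{context}" if context else query
        outputs[node] = run_node(node, prompt)
    return outputs
```

For a graph like `{"search": set(), "summarize": {"search"}}`, the `summarize` node's prompt carries the `search` node's output as context.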
- Just provide the sources to build knowledge from, such as PDFs or web-links.
- Performs chunking and embedding on documents.
- Applies PCA on embeddings and clusters the reduced embeddings.
- Orders clusters to maintain the flow of thoughts.
- Provides a video lecture to assist in understanding concepts.
- Provides an objective quiz to test understanding of the concept.
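The chunk → embed → PCA → cluster → order pipeline might look roughly like this, assuming scikit-learn and a placeholder `embed` function (the app's real embedding model is not specified here):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

def embed(chunks):
    # Placeholder embedding, for illustration only: random fixed-size
    # vectors stand in for a real text-embedding model.
    rng = np.random.default_rng(0)
    return rng.normal(size=(len(chunks), 64))

def build_knowledge_base(text, chunk_size=200, n_clusters=3):
    # 1. Chunking: split the document into fixed-size pieces.
    chunks = [text[i:i + chunk_size] for i in range(0, len(text), chunk_size)]
    # 2. Embed each chunk.
    vectors = embed(chunks)
    # 3. PCA to reduce dimensionality before clustering.
    reduced = PCA(n_components=min(8, len(chunks))).fit_transform(vectors)
    # 4. Cluster the reduced embeddings.
    labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(reduced)
    # 5. Order clusters by the position of their earliest chunk, so the
    #    generated pages follow the source's flow of thought.
    order = sorted(set(labels),
                   key=lambda c: min(i for i, l in enumerate(labels) if l == c))
    return [[chunks[i] for i in range(len(chunks)) if labels[i] == c]
            for c in order]
```

Ordering clusters by their earliest chunk index is one simple way to "maintain the flow of thoughts" from the source document.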
- Home Page: Add links to update the LLM's knowledge base.
- Converse AI: Access new pages generated by the LLM from the provided links.
- Insight AI: Learn through video lectures.
- Evaluate AI: Take tests generated by the LLM.
- Chat AI: Chat with the AI based on updated knowledge and general information.
The starting point of the application where users can input links to resources. These links will be used to update the LLM's knowledge base, ensuring that the AI is up-to-date with the latest information provided by the user.
A section where users can read new pages generated by the LLM from the links provided on the Home Page. This allows users to stay updated with the latest information and insights derived from their specified sources.
This page features video lectures that facilitate learning. Users can watch educational videos tailored to the topics they've provided, making learning more interactive and engaging.
Users can take tests generated by the LLM on this page. The tests are designed to evaluate the user's understanding of the material provided in the links, helping users to assess their knowledge and progress.
A chat interface where users can interact with the AI. The AI uses the updated knowledge base to answer questions and provide information, making it a valuable tool for learning and general inquiries.
Contributions are what make the open source community such an amazing place to learn, inspire, and create. Any contributions you make are greatly appreciated.
If you have a suggestion that would make this better, please fork the repo and create a pull request. You can also simply open an issue with the tag "enhancement". Don't forget to give the project a star! Thanks again!
- Fork the Project
- Create your Feature Branch (`git checkout -b feature/AmazingFeature`)
- Commit your Changes (`git commit -m 'Add some AmazingFeature'`)
- Push to the Branch (`git push origin feature/AmazingFeature`)
- Open a Pull Request
A special thank you to all the current contributors who have made this project possible. You can view the full list of contributors on the repository's contributors page.