Our AI-powered chatbots answer questions using the data you choose to ingest, tailoring responses to your specific information. These chatbots are powered by a variety of large language models, including various ChatGPT and Claude models. As a chatbot owner, you can either upload documents to a provided Google Drive or share an existing Google Drive with the AI team. Valid file types include Google Docs, Google Sheets, Word documents, Excel files, PowerPoints, CSV files, PDFs, and TXT files (popular for video transcripts). Our chatbots can also scrape websites and ingest data in that manner (this is currently not enabled for websites behind CAS authentication). Our chatbots use RAG (Retrieval-Augmented Generation) to generate responses from the data in your database. When requesting a chatbot, you will choose from several options based on your use case. Here is the link to the UMD Virtual Agent Provisioning Request form.
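At a high level, RAG retrieves the ingested chunks most similar to a question and hands them to the LLM as context. The sketch below illustrates that general flow; the helper names (embed, generate) and the chunk structure are placeholders for illustration, not our production code.

```python
# Minimal RAG sketch (hypothetical helper names; illustration only, not the production system).
from dataclasses import dataclass

@dataclass
class Chunk:
    source: str      # e.g. the Google Drive file or scraped page the text came from
    text: str
    embedding: list  # vector produced by an embedding model

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sum(x * x for x in a) ** 0.5
    norm_b = sum(y * y for y in b) ** 0.5
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def answer_question(question, chunks, embed, generate, top_k=4):
    """Retrieve the ingested chunks most similar to the question, then ask the LLM to answer from them."""
    q_vec = embed(question)
    ranked = sorted(chunks, key=lambda c: cosine_similarity(q_vec, c.embedding), reverse=True)
    context = "\n\n".join(f"[{c.source}] {c.text}" for c in ranked[:top_k])
    prompt = f"Answer using only the sources below, citing them in-line.\n\n{context}\n\nQuestion: {question}"
    return generate(prompt)  # `generate` would call a ChatGPT or Claude model
```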
Once a chatbot is provisioned, the chatbot administrators will have access to the Admin Console. There they can see the questions their chatbot has been asked and review the data the chatbot has ingested. A metrics page gauges usage of the chatbot and breaks down the number of questions asked by topic, and a configuration page lets the administrator further customize the chatbot.
There are currently four different types of chatbots: Academic, Departmental, Personal, and Concept. These bot types cater to the various use cases identified across campus.
Departmental chatbots can be used by departments within the University of Maryland. They can be either public-facing or internal, depending on the needs of the department. It is also possible to create chatbots for smaller divisions or teams within a department. The chatbot administrators choose which documents the chatbot will ingest and whether there is a relevant website to scrape for information. Examples of our existing departmental bots include Student Financial Services and Cashiering (public-facing) and the Division of Information Technology (internal).
Personal chatbots are intended for use by an individual. Anyone can request a personal chatbot and populate its database with any information they would like to query. Because these chatbots use RAG, the experience differs from using ChatGPT or Claude directly on their websites: the LLM has access to your specific data.
Concept chatbots are the team's way of supporting exploratory use cases and expanding the current bounds of what our chatbots can do. If you are curious whether your use case would make for a great concept bot, please contact the AI team at ese-sws@umd.edu.
Academic chatbots are used for academic courses and have a variety of specializations based on requests from professors during our trial periods. Chatbots are currently being used for classes at the 100 through 700 levels. Professors are encouraged to upload any documents that may be used for the class to the Google Drive folder, including transcripts from video lectures. Video transcripts are chunked into 1-minute sections, which allows students to easily find the exact portion of a lecture that pertains to their question. If a piece of data was used to formulate the answer, in-line citations appear throughout the answer so the student can find more information if needed.
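As a rough illustration of the 1-minute chunking idea (the input format and function below are assumptions for illustration, not our actual ingestion pipeline):

```python
# Sketch of 1-minute transcript chunking (assumed input: (seconds, text) pairs; illustration only).
def chunk_transcript(segments, window_seconds=60):
    """Group timestamped transcript segments into one-minute chunks.

    `segments` is a list of (start_time_in_seconds, text) tuples. Returns a list of dicts
    with the minute range and combined text, so a citation can point students to the
    exact portion of the lecture.
    """
    chunks = {}
    for start, text in segments:
        bucket = int(start // window_seconds)  # which minute this segment falls in
        chunks.setdefault(bucket, []).append(text)
    return [
        {"start_minute": bucket, "end_minute": bucket + 1, "text": " ".join(parts)}
        for bucket, parts in sorted(chunks.items())
    ]

# Example: a segment starting at 75 seconds lands in the 1-2 minute chunk.
print(chunk_transcript([(10, "Welcome to lecture 3."), (75, "Today we cover RAG.")]))
```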
Exam Shut-off - The chatbot can be shut off during any specified period of time. This is configured by the chatbot administrator and can be changed in the Admin Console at any time. During this period, the chatbot offers a canned response that is configurable per chatbot.
Avoidance of Assessment Material - Documents can be uploaded to an Assessments folder if the chatbot administrator would like certain questions not to be answered by the chatbot. Both an exact search and a similarity search are used to determine whether the user's question is too closely related to the assessment material. There are two canned responses that the chatbot administrator can configure for this occasion: 1) if the chatbot pulls sources besides the assessment material, a response can be provided that includes those sources for the user to explore; 2) if the chatbot does not pull from any other sources, it simply responds that the question was too closely related to the assessment material and it is not allowed to answer.
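The sketch below shows how an exact-plus-similarity guard of this kind could work in principle; the threshold, helper functions, and canned response wording are assumptions for illustration, not the configured behavior of your chatbot.

```python
# Sketch of an assessment-material guard (hypothetical threshold and helpers; illustration only).
def is_too_close_to_assessment(question, assessment_chunks, embed, similarity, threshold=0.85):
    """Return True if the question matches assessment material exactly or is semantically too similar."""
    normalized = question.strip().lower()
    # Exact search: the question text appears verbatim in an assessment document.
    if any(normalized in chunk.lower() for chunk in assessment_chunks):
        return True
    # Similarity search: the question embedding is too close to any assessment chunk.
    q_vec = embed(question)
    return any(similarity(q_vec, embed(chunk)) >= threshold for chunk in assessment_chunks)

def respond(question, assessment_chunks, other_sources, embed, similarity):
    if is_too_close_to_assessment(question, assessment_chunks, embed, similarity):
        if other_sources:
            # Canned response 1: point the user at the non-assessment sources that were retrieved.
            return "I can't answer that directly, but these sources may help: " + ", ".join(other_sources)
        # Canned response 2: no other sources were retrieved.
        return "That question is too closely related to assessment material, so I'm not allowed to answer."
    return None  # fall through to the normal RAG answer
```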
Quiz Features (in beta) - Academic chatbots have the option to enable quiz features. If the user includes keywords like "quiz" or "multiple choice," the chatbot will create an interactive quiz that the user can complete. Each question includes question-specific citations so the user can easily access relevant information.
User questions will be logged in the Admin Console (see the Disable question logging section if you choose not to log questions for review). In the console, the administrator can see which sources were chosen as most similar to the question and which sources were used to create the answer. There is a tab for questions marked with a thumbs up or thumbs down, as well as for questions that were known by the chatbot and those labeled as unknown. The administrator can also mark questions as reviewed or for testing purposes, and can submit feedback for the chatbot developers if an issue requires developer attention. The developers receive a daily email with any feedback so they can address concerns.
On every question, the user is able to select a thumbs up or thumbs down. If the user selects thumbs down, a text box appears that allows them to provide written feedback on the answer. This feedback is available to the administrator in the Admin Console. It can help in determining when the chatbot provides answers that users are not pleased with, and it can help administrators determine whether there is incorrect or inconsistent information in the provided content or websites.
In the Admin Console, administrators can add scripted question entries, which the bot can use to answer questions that may not be directly answered in the existing content. This can also help in providing more exact wording for sensitive information or clarifying information around costs or times. Each entry consists of sample questions, sample keywords, and the answer associated with them.
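Conceptually, a scripted entry pairs sample questions and keywords with a fixed answer. The sketch below is a rough illustration of that structure and a simple matching pass; the field names, matching logic, and example entry are assumptions, not the Admin Console's actual schema.

```python
# Sketch of a scripted-question entry and a simple matching pass (illustration only).
from dataclasses import dataclass

@dataclass
class ScriptedEntry:
    sample_questions: list
    keywords: list
    answer: str

def match_scripted_entry(question, entries):
    """Return the scripted answer if the question matches an entry's samples or keywords."""
    q = question.lower()
    for entry in entries:
        if any(q == s.lower() for s in entry.sample_questions):
            return entry.answer
        if any(kw.lower() in q for kw in entry.keywords):
            return entry.answer
    return None

entries = [ScriptedEntry(
    sample_questions=["When is tuition due?"],
    keywords=["tuition due", "payment deadline"],
    answer="Tuition payment deadlines are published by Student Financial Services each term.",
)]
print(match_scripted_entry("What is the payment deadline for spring?", entries))
```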
There are two options for CAS integration with the chatbots.
The chatbot detects numbers that resemble university IDs and Social Security numbers and masks them as soon as the message is sent, as well as when the data is stored and shown in the Admin Console.
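For illustration, masking of this kind can be done with pattern matching; the patterns below (a dashed SSN format and an assumed 9-digit university ID) and the redaction labels are examples only, not the detector the chatbot actually uses.

```python
# Sketch of PII masking (the ID patterns below are assumptions for illustration;
# the production detector may recognize different formats).
import re

SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")  # e.g. 123-45-6789
UID_PATTERN = re.compile(r"\b\d{9}\b")              # assumed 9-digit university ID

def mask_pii(text):
    """Replace anything that looks like an SSN or university ID before the message is stored or displayed."""
    text = SSN_PATTERN.sub("[REDACTED SSN]", text)
    text = UID_PATTERN.sub("[REDACTED ID]", text)
    return text

print(mask_pii("My UID is 123456789 and my SSN is 123-45-6789."))
# -> "My UID is [REDACTED ID] and my SSN is [REDACTED SSN]."
```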
By default, chatbots store user questions in a database and present them in the Admin Console for administrators to review. However, if you handle sensitive or private information, there is an option to forgo saving question data. In this case, no question data will be shown in the Admin Console and administrators will not have the ability to review questions.
User analytics show monthly metrics for any chatbots you have access to view. These metrics include the number of questions asked, the number of unknown questions, the number of known questions, the number of thumbs-down questions, the number of thumbs-up questions, and the number of questions by intent. The last metric allows administrators to determine which types of questions users are most interested in.
Data can be presented in three different formats for citation purposes. Sources that need to remain private can be placed in a Private folder; these are shown as "Private Source" when cited by the chatbot. Any sources placed in the Public folder are cited with a link to the source for easy access. Sources not placed in either folder are cited by their source title, but a direct link is not provided. This allows professors to give easy access to materials when needed while keeping any assessment material private.
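The folder-based citation rules can be summarized with a small sketch; the function, source fields, and example URL below are illustrative assumptions rather than the chatbot's actual implementation.

```python
# Sketch of folder-based citation formatting (folder names per this guide; fields are assumptions).
def format_citation(source):
    """Return citation text for a source based on the folder it was ingested from.

    `source` is assumed to be a dict like {"title": ..., "url": ..., "folder": "Private" | "Public" | None}.
    """
    if source.get("folder") == "Private":
        return "Private Source"                        # title and link withheld
    if source.get("folder") == "Public":
        return f"{source['title']} ({source['url']})"  # cited with a direct link
    return source["title"]                             # title only, no link

print(format_citation({"title": "Lecture 4 Transcript", "url": None, "folder": "Private"}))
print(format_citation({"title": "Syllabus", "url": "https://example.edu/syllabus", "folder": "Public"}))
```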
If you have any questions about the information presented in this overview guide, please contact the AI team at ese-sws@umd.edu. If you submit a request for a chatbot and are approved, you will receive a more detailed user guide that walks you through setting up and making the most of your chatbot.