After completing the UOOWUI server setup, you should now be able to log in to the Open-WebUI setup page at http://SERVERIP:8080 from a web browser on another device on the same LAN.
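If the page does not load, it helps to confirm the port is actually answering before troubleshooting the browser. A quick check with curl (SERVERIP is a placeholder for your server's address):

```shell
# From another device on the LAN, confirm the Open-WebUI port responds.
# Replace SERVERIP with your server's actual IP address.
curl -I http://SERVERIP:8080
```

A reachable instance returns HTTP response headers; a connection refused or timeout points to a firewall or service problem on the server instead.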
You should be greeted with this page. Click Get Started to create the admin account.
I used admin@ai.local with the password admin. The account you create does not need to be a real, active email address, as this is a local AI. You will use this email address to log in as the administrative user.
After the announcement page, this is the first interface screen you will see. Starting at the top left, you have the bars for the menu items. Next to that is the currently selected model (LLM) being used for AI. I have switched the LLM to Gemma3 Abliterated, a rather small-scale LLM chosen to help conserve image space.
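If a model does not appear in the selector, it likely has not been pulled on the server yet. A sketch using the ollama CLI on the server (the model tag here is an example, not the exact Abliterated variant; verify the name in the Ollama model library first):

```shell
# Pull a small Gemma 3 model on the server (tag is illustrative;
# check the Ollama library for the exact name of the variant you want).
ollama pull gemma3:4b

# List the models available locally to confirm the pull succeeded.
ollama list
```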
Clicking the menu bars opens the Chat / Workspace menu items. Here you can create a new chat, return to the AI chat window, or revisit other conversations.
Clicking on your avatar (the yellow circle with your initials, top right) opens the user menu choices, plus the admin menu choice if the user is an administrator. Sometimes the avatar appears in the bottom left when the menu bars are activated.
Let's go to the Admin Panel!
The admin panel opens to the user management area. You can add and manage users from here, and clicking a user's Role button changes that user's role. Notice that Users is highlighted in the top menu; these are your admin menu items. LDAP is an option you can configure for your users.
GENERAL ADMIN SETTINGS
If we click Settings in the admin menu, we are sent to the General sub-menu of the admin configuration page. Adjust these settings based on your preferences. The Settings sub-menu may appear either to the left of or below the Settings menu. Remember to click SAVE at the bottom of the page to commit the changes.
Let's click on Connections next.
This area is where we connect to LLMs. Several settings are also available for other online services; what is shown here is for a local Ollama AI instance only. The Ollama instance should be found automatically. If not, check that the service is running and verify the connection information by clicking the configure GEAR icon. You can manage models for each connection by clicking the PENCIL icon. Let's move on to the Models sub-menu.
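Before digging into the GEAR icon settings, it can save time to verify the Ollama service itself from the command line. Ollama listens on port 11434 by default (adjust the host if Open-WebUI runs on a different machine):

```shell
# Sanity check that the Ollama service is up and reachable.
# Use the server's IP instead of localhost if checking from another device.
curl http://localhost:11434
# A running instance replies with: Ollama is running
```

If this check fails, the problem is with the Ollama service or network, not with Open-WebUI's connection settings.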
Remember to click SAVE at the bottom of the page to commit the changes.
This panel lists all the models available to Open-WebUI. You can turn them on or off and edit the name and features of each model by clicking the Pencil. Go ahead, click the Pencil.
These are all the features of the LLM that you can edit from within Open-WebUI. Clicking the small circled square in the OI logo lets you change the icon for the LLM so it's easy to identify. Visibility allows other users to see the model. Model Params lets you configure settings on the LLM, such as the number of CPUs / GPUs, temperature, frequency penalty, etc. Remember to scroll to the bottom of the page and click Save & Update!
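These same parameters map onto the options object in Ollama's own API, which is what Open-WebUI is setting under the hood. A hedged example of passing them directly to a local Ollama instance (the model tag and values are illustrative, not recommendations):

```shell
# Send a generation request with explicit model parameters.
# temperature, frequency_penalty, and num_gpu are standard Ollama options;
# the model tag is an example and should match one shown by `ollama list`.
curl http://localhost:11434/api/generate -d '{
  "model": "gemma3:4b",
  "prompt": "Hello",
  "options": {
    "temperature": 0.7,
    "frequency_penalty": 0.5,
    "num_gpu": 1
  }
}'
```

Experimenting here is a handy way to find parameter values worth saving permanently in the Model Params screen.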