
Conversation


Claude AI commented Feb 9, 2026

Complete Fix for All NoneType Errors

Root Cause Analysis

The error "'NoneType' object has no attribute 'get'" was occurring because:

  1. __user__["id"] was accessed without a None check at lines 2407 and 1571
  2. __event_emitter__() was called without None checks in 15 locations
  3. Type hints didn't reflect reality: parameters were marked as required but can be None

Comprehensive Fix

  • Fixed __user__["id"] access in pipe() method (line 2407)
  • Fixed __user__["id"] access in _upload_image_with_status (line 1571) with fallback
  • Wrapped all 15 __event_emitter__() calls with None checks
  • Updated all type hints to Optional where parameters can be None
  • Bumped version to 1.11.4
  • Validated Python syntax

Changes by Method

pipe() method:

# Before (crashes when __user__ is None)
self.user = Users.get_user_by_id(__user__["id"])

# After (safe)
self.user = Users.get_user_by_id(__user__["id"]) if __user__ else None

_upload_image_with_status:

# Added early return with fallback when __user__ is None
if not __user__:
    self.log.warning("No user provided, using data URL fallback")
    return f"data:{mime_type};base64,{image_data_b64}"
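The fallback branch can be exercised in isolation. A minimal sketch, assuming nothing beyond the snippet above; the function name, signature, and upload path are illustrative stand-ins for the real _upload_image_with_status, not its actual code:

```python
import base64
import logging

log = logging.getLogger(__name__)

def upload_image_with_fallback(__user__, image_bytes, mime_type="image/png"):
    # Encode once; both the fallback and the normal path need the payload.
    image_data_b64 = base64.b64encode(image_bytes).decode("ascii")
    if not __user__:
        # No user context: return an inline data URL instead of crashing.
        log.warning("No user provided, using data URL fallback")
        return f"data:{mime_type};base64,{image_data_b64}"
    # The real method uploads via Open WebUI's file storage here (omitted).
    return None
```

With no user, the caller still receives a usable image reference, so rendering degrades gracefully instead of raising AttributeError.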

All event emitters:

# Before (crashes when None)
await __event_emitter__({...})

# After (safe, follows Azure pipeline pattern)
if __event_emitter__:
    await __event_emitter__({...})

Type hints updated:

  • __metadata__: dict[str, Any] → Optional[dict[str, Any]] = None
  • __event_emitter__: Callable → Optional[Callable] = None (6 methods)
  • __user__: dict → Optional[dict] = None (1 method)
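On a standalone function, the corrected hints read as follows. This is an illustrative sketch only; the real pipe() is a method with more parameters than shown:

```python
from typing import Any, Callable, Optional

def pipe_sketch(
    body: dict,
    __user__: Optional[dict] = None,
    __event_emitter__: Optional[Callable] = None,
    __metadata__: Optional[dict[str, Any]] = None,
) -> Optional[str]:
    # With Optional defaults, every injected parameter may legitimately be
    # absent, and the body must guard accordingly.
    return __user__["id"] if __user__ else None
```

The defaults mean Open WebUI can call the function without injecting any of the optional parameters, which is exactly the scenario that was crashing.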

Pattern Followed

This fix follows the established pattern from the Azure AI Foundry pipeline, which consistently checks if __event_emitter__: before all calls.
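One way to centralize that guard is a small helper that no-ops when no emitter was injected. The helper below is hypothetical — the pipeline inlines the `if __event_emitter__:` check at each call site instead — but it shows the same pattern:

```python
import asyncio
from typing import Awaitable, Callable, Optional

async def safe_emit(
    emitter: Optional[Callable[[dict], Awaitable[None]]],
    event: dict,
) -> bool:
    """Invoke the emitter only when one was provided; report whether it ran."""
    if emitter:
        await emitter(event)
        return True
    return False

events = []

async def demo():
    async def emitter(ev):
        events.append(ev)
    # With an emitter the event goes through; with None nothing happens.
    sent = await safe_emit(emitter, {"type": "status", "data": {"done": True}})
    skipped = await safe_emit(None, {"type": "status"})
    return sent, skipped

sent, skipped = asyncio.run(demo())
```

A helper like this would also make it impossible to forget the guard at a new call site, at the cost of one extra indirection.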

Original prompt

This section details the original issue you should resolve

<issue_title>After configuring the Gemini pipeline, chatting with the LLM doesn't work and it shows an error message</issue_title>
<issue_description>### Question summary

I am not able to make the pipeline work. When I chat with Gemini I get this error: "'NoneType' object has no attribute 'get'". Am I missing something in the pipeline config?

Details

I have followed the steps in the README file. I added the pipeline code as a new function, then I added the WEBUI_SECRET_KEY parameter in the open webui container, recreated the container, then I added my google api key in the valves config, and finally I enabled the function.

I am able to see the gemini models listed in the Models section of the Admin Panel's Settings, but when chatting with any gemini model, I get this error:

  • 'NoneType' object has no attribute 'get'

The logs indicate this:
DEBUG | open_webui.main:process_chat:1746 - Error processing chat payload: 'NoneType' object has no attribute 'get'

Is there a configuration I am missing? I reviewed all the valves configurations and everything seems to have its own default value except for the API Key (which I already added and shows encrypted).

What have you tried?

I've tried different gemini models with the same result (the error message). When using the OpenAI-compatible URL of Gemini as a normal connection in Open WebUI, it works correctly. </issue_description>

<agent_instructions>I cannot reproduce the error. Check where this error might occur and implement a fix, and update the documentation if necessary.</agent_instructions>

Comments on the Issue (you are @claude[agent] in this section)



When no filters are installed or Open WebUI doesn't provide the __metadata__
parameter, it can be None, causing "'NoneType' object has no attribute 'get'" error.

Added defensive check: metadata = __metadata__ or {}
This ensures metadata is always a dict, even when __metadata__ is None.
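A minimal sketch of the pattern, using a hypothetical accessor and an illustrative "chat_id" key (neither is confirmed from the pipeline source):

```python
def read_chat_id(__metadata__):
    # `or {}` normalizes None to an empty dict, so .get() cannot raise
    # AttributeError regardless of what Open WebUI passed in.
    metadata = __metadata__ or {}
    return metadata.get("chat_id")
```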

Co-authored-by: owndev <69784886+owndev@users.noreply.github.com>
Claude AI changed the title from "[WIP] Fix error in Gemini pipeline when chatting with LLM" to "Fix AttributeError when using Gemini pipeline without filters" Feb 9, 2026
Claude AI requested a review from owndev February 9, 2026 06:47
Claude AI changed the title from "Fix AttributeError when using Gemini pipeline without filters" to "Fix NoneType error in Gemini pipeline when __metadata__ is None" Feb 9, 2026
…eError

The error "'NoneType' object has no attribute 'get'" was still occurring
because file_obj.meta could be None when accessing the content_type.

Added defensive check: mime = (file_obj.meta or {}).get("content_type", "image/png")
This ensures file_obj.meta is treated as an empty dict when None.
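The same `x or {}` idiom can be checked in isolation; here SimpleNamespace stands in for the actual file record object:

```python
from types import SimpleNamespace

def content_type_of(file_obj):
    # A None meta is treated as an empty dict before .get() runs, so the
    # default "image/png" is returned instead of raising AttributeError.
    return (file_obj.meta or {}).get("content_type", "image/png")
```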

Bumped version from 1.11.2 to 1.11.3

Co-authored-by: owndev <69784886+owndev@users.noreply.github.com>
Claude AI changed the title from "Fix NoneType error in Gemini pipeline when __metadata__ is None" to "Fix NoneType errors in Gemini pipeline metadata and file handling" Feb 9, 2026

owndev (Owner) commented Feb 9, 2026

@codex[agent] The problem still exists: 'NoneType' object has no attribute 'get'


owndev (Owner) commented Feb 9, 2026

@copilot The problem still exists: 'NoneType' object has no attribute 'get'


Copilot AI commented Feb 9, 2026

@copilot The problem still exists: 'NoneType' object has no attribute 'get'

I've identified and fixed the root causes. The issue was that the __user__ and __event_emitter__ parameters can be None, but the code accessed them without defensive checks in multiple locations.

Fixed in this commit:

  1. Added None check for __user__["id"] access (line 2407)
  2. Added fallback behavior in _upload_image_with_status when __user__ is None
  3. Wrapped all 15 __event_emitter__() calls with None checks
  4. Updated type hints to Optional to reflect reality

The fixes follow the Azure pipeline pattern of checking if __event_emitter__: before all calls. This should resolve all remaining NoneType errors.



Successfully merging this pull request may close the issue: "After configuring the Gemini pipeline, chatting with the LLM doesn't work and it shows an error message"