[wp-trac] [WordPress Trac] #65168: AI Client: Unable to reliably select model for providers (model preference ignored / inconsistent API)

WordPress Trac noreply at wordpress.org
Tue May 5 17:18:59 UTC 2026


#65168: AI Client: Unable to reliably select model for providers (model preference
ignored / inconsistent API)
--------------------------+-----------------------------
 Reporter:  jabir20       |      Owner:  (none)
     Type:  defect (bug)  |     Status:  new
 Priority:  normal        |  Milestone:  Awaiting Review
Component:  AI            |    Version:  trunk
 Severity:  normal        |   Keywords:
  Focuses:                |
--------------------------+-----------------------------
 Environment:
 - WordPress: 7.0-beta
 - AI Client: Core AI component
 - Provider: AI Provider for Google (Gemini)
 - Setup: Docker (local)

 Issue:

 When using the AI Client API, there is no consistent or working way to
 explicitly select a model.

 Attempt 1:
 Calling ->model() fails with a fatal error because the method does not
 exist:

 "Method model does not exist on
 WordPress\AiClient\Builders\PromptBuilder."

 Attempt 2:
 Calling ->using_model_preference('gemini-2.5-flash') does not reliably
 override the provider's default model: the provider still attempts to use
 "gemini-2.5-pro", which results in 429 quota errors.

 Example code:

 $result = wp_ai_client_prompt('Say hello')
     ->using_model_preference('gemini-2.5-flash')
     ->generate_text();

 Result:
 WP_Error with Gemini API 429 quota exceeded on model gemini-2.5-pro
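
 A defensive workaround, until the API is stabilized, could look like the
 sketch below. It only relies on the two identifiers already shown above
 (wp_ai_client_prompt() and using_model_preference()) plus PHP's built-in
 method_exists(), so nothing else about the AI Client API is assumed:

 // Workaround sketch: avoid the fatal from calling a method that may
 // not exist on PromptBuilder. This does NOT guarantee the preference
 // is honored; it only prevents the crash from Attempt 1.
 $builder = wp_ai_client_prompt( 'Say hello' );

 if ( method_exists( $builder, 'using_model_preference' ) ) {
     $builder = $builder->using_model_preference( 'gemini-2.5-flash' );
 }

 $result = $builder->generate_text();

 if ( is_wp_error( $result ) ) {
     // Log the full message so it is visible which model the provider
     // actually used (e.g. the 429 above names gemini-2.5-pro).
     error_log( $result->get_error_message() );
 }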

 Expected behavior:

 - Developers should be able to explicitly set the model per request
 - The selected model should be respected by the provider
 - The API should provide a consistent method (e.g. ->model()) or
 documented alternative
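
 To illustrate the expected behavior, a consistent per-request selection
 might read like the sketch below. Note that ->using_model() is
 hypothetical here, not an existing method; it stands in for whatever
 stable, documented method the API eventually provides:

 // Hypothetical API sketch -- ->using_model() does not exist today.
 $result = wp_ai_client_prompt( 'Say hello' )
     ->using_model( 'gemini-2.5-flash' ) // hypothetical: must be honored
     ->generate_text();

 // If the provider cannot serve the requested model, it should return a
 // WP_Error naming that model rather than silently falling back to a
 // different default.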

 Actual behavior:

 - No stable API for model selection
 - Provider may ignore model preference
 - Leads to unexpected failures depending on provider defaults

 Notes:

 - The same API key works correctly when calling Gemini directly outside
 WordPress
 - This makes it difficult to build plugins that depend on predictable
 model usage
 - Also impacts multi-plugin environments where different plugins may
 require different models/providers

 Suggested improvement:

 - Provide a stable and documented method for model selection in AI Client
 - Ensure providers must respect explicit model selection
 - Optionally support per-plugin or per-request provider configuration
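
 The third bullet (per-request provider configuration) could be sketched
 as follows. Both ->using_provider() and ->using_model() are hypothetical
 method names used purely for illustration:

 // Hypothetical sketch of per-request provider + model pinning, so two
 // plugins on the same site can target different providers/models
 // without interfering with each other.
 $result = wp_ai_client_prompt( 'Summarize this post' )
     ->using_provider( 'google' )        // hypothetical method
     ->using_model( 'gemini-2.5-flash' ) // hypothetical method
     ->generate_text();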

-- 
Ticket URL: <https://core.trac.wordpress.org/ticket/65168>
WordPress Trac <https://core.trac.wordpress.org/>
WordPress publishing platform
