fix: Use LLM to generate unique scene prompts for video extensions #318
Open
crowwdev wants to merge 1 commit into chenyme:main from
Conversation
- Integrate grok-4.1-fast to generate unique scene descriptions for each video round
- Prevent scene repetition in 30-second videos by using different prompts per 6-second segment
- Add _generate_scene_prompts_llm() in video.py for base video generation
- Add _generate_scene_prompt_for_extend() in video_extend.py for manual extensions
- Each scene now has natural progression without repetition

Fixes chenyme#316
```python
async with aiohttp.ClientSession() as session:
    async with session.post(
        "http://localhost:8000/v1/chat/completions",
        headers={"Content-Type": "application/json"},
```
This API needs the api_key:

```python
api_key = get_config("app.api_key")
headers = {"Content-Type": "application/json"}
if api_key:
    headers["Authorization"] = f"Bearer {api_key}"
```
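A minimal sketch of the reviewer's suggestion as a standalone helper. `build_headers` is an illustrative name, and the optional key is passed in directly rather than read via `get_config`, which is project-specific:

```python
def build_headers(api_key=None):
    """Build request headers, attaching a Bearer token only when a key is configured."""
    headers = {"Content-Type": "application/json"}
    if api_key:
        headers["Authorization"] = f"Bearer {api_key}"
    return headers
```

With no key configured the request stays anonymous; with a key, the standard `Authorization: Bearer <key>` header is sent.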
```python
"model": "grok-4.1-fast",
"messages": [{"role": "user", "content": system_msg}],
"temperature": 0.8,
"max_tokens": 2000
```
```python
try:
    async with aiohttp.ClientSession() as session:
        async with session.post(
            "http://localhost:8000/v1/chat/completions",
```
```python
"model": "grok-4.1-fast",
"messages": [{"role": "user", "content": system_msg}],
"temperature": 0.8,
"max_tokens": 300
```
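The two hunks build the same request body and differ only in `max_tokens` (2000 for the batch of base-video prompts, 300 for a single extension prompt). A sketch of that shared payload; `build_chat_payload` is a hypothetical helper, not a function in the PR:

```python
def build_chat_payload(system_msg, max_tokens):
    """Assemble the chat-completions request body shared by both generators.

    The base generator asks for up to 2000 tokens (several scene prompts at
    once), while the per-extension generator only needs ~300 for one scene.
    """
    return {
        "model": "grok-4.1-fast",
        "messages": [{"role": "user", "content": system_msg}],
        "temperature": 0.8,  # some variety between scenes, but still coherent
        "max_tokens": max_tokens,
    }
```

Factoring the payload out like this would also keep the model name and temperature in one place if they ever change.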
Problem
When generating videos longer than 6 seconds, all extension rounds use the same prompt, causing scene repetition. For example, a 30-second video results in 5 nearly identical 6-second segments.
Related to Issue #316
Solution
- Use the grok-4.1-fast model to generate unique scene descriptions for each video round
- Add _generate_scene_prompts_llm() in video.py for base video generation
- Add _generate_scene_prompt_for_extend() in video_extend.py for manual extensions

Technical Details

- Calls the local LLM API (http://localhost:8000/v1/chat/completions) with the grok-4.1-fast model

Testing
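The repetition described in Problem comes from reusing one prompt across every 6-second round; a 30-second video is 5 rounds. A sketch of that per-round split, assuming a fixed 6-second segment length (function names here are illustrative, not the PR's):

```python
import math

SEGMENT_SECONDS = 6  # length of each generation round


def count_rounds(total_seconds):
    """Number of 6-second rounds needed to cover the requested duration."""
    return math.ceil(total_seconds / SEGMENT_SECONDS)


def scene_slots(total_seconds):
    """One slot per round; each slot would get its own LLM-generated scene prompt."""
    n = count_rounds(total_seconds)
    return [f"scene {i + 1} of {n}" for i in range(n)]
```

Requesting one distinct prompt per slot, instead of reusing the user's original prompt, is what gives each segment its own scene.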