Improve Data Stream Cleanup and Management with Enhanced Tracking and Sequential IDs#111
Open
Conversation
…s, column value assignment with mathematical values, IN logic, string comparison logic, value assignment in IF THEN ELSE conditions
…se value handling for when we don't have enough data.
…opics functionality to clear all the created data streams on closing.
- Ensure unique topics are deleted without duplicates.
- Gracefully handle Kafka-specific exceptions like 'UNKNOWN_TOPIC_OR_PART' to avoid unnecessary errors.
- Added proper logging for successful deletions and warnings for topics that don't exist.
- Cleaned up created_streams list after successful deletion.
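The cleanup steps above can be sketched as follows. This is a minimal illustration, not the PR's actual code: `delete_topic` stands in for whatever Kafka admin call the project uses, and `UnknownTopicError` is a hypothetical placeholder for Kafka's `UNKNOWN_TOPIC_OR_PART` error.

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("stream_cleanup")


class UnknownTopicError(Exception):
    """Placeholder for Kafka's UNKNOWN_TOPIC_OR_PART error."""


def cleanup_streams(created_streams, delete_topic):
    """Delete each tracked topic exactly once, tolerating missing topics.

    `delete_topic` is any callable that removes one Kafka topic and raises
    UnknownTopicError when the topic does not exist on the broker.
    Returns the topics that were actually deleted; `created_streams` is
    cleared afterwards, matching the PR's cleanup of the tracking list.
    """
    deleted = []
    # dict.fromkeys keeps order while dropping duplicate topic names
    for topic in dict.fromkeys(created_streams):
        try:
            delete_topic(topic)
            log.info("Deleted topic %s", topic)
            deleted.append(topic)
        except UnknownTopicError:
            log.warning("Topic %s does not exist; skipping", topic)
    created_streams.clear()
    return deleted
```

Injecting the delete callable keeps the dedup-and-tolerate logic testable without a running broker.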
Collaborator
Author
The .env_kafka file has been updated; this new version needs to be accepted.
rbardaji
reviewed
Oct 28, 2024
KAFKA_PREFIX=data_stream
KAFKA_PORT=9092
- KAFKA_HOST=155.101.6.194 (no newline at end of file)
+ KAFKA_HOST=155.101.6.194
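A settings file like this is typically read from the environment. The helper below is a hypothetical sketch, not project code; the default of 10 for `KAFKA_MAX_STREAMS` (the cap mentioned in the PR description) is an illustrative assumption.

```python
import os


def load_kafka_config(env=os.environ):
    """Read Kafka settings as they would come from .env_kafka.

    Defaults are illustrative assumptions, not values taken from the project.
    """
    return {
        "prefix": env.get("KAFKA_PREFIX", "data_stream"),
        "host": env.get("KAFKA_HOST", "localhost"),
        "port": int(env.get("KAFKA_PORT", "9092")),
        "max_streams": int(env.get("KAFKA_MAX_STREAMS", "10")),
    }
```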
Collaborator
Could you remove the IP please?
This merge introduces significant enhancements to the data stream management and cleanup process, ensuring efficient resource utilization and better user control. Key changes include:
- Active Consumers and Data Streams Tracking
- Sequential Integer-Based Stream IDs (e.g., data_stream_1, data_stream_2)
- Enforced Maximum Stream Limit (KAFKA_MAX_STREAMS)
- New Endpoint for Stream Deletion: streams can be deleted by numeric ID (1, 2) or by full topic name
- Stream ID Reuse Mechanism
- Enhanced Kafka Topic Deletion
These improvements streamline the data stream lifecycle, offering efficient resource management, reducing the potential for stream ID exhaustion, and providing greater control over stream creation and deletion processes.
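The sequential-ID scheme with reuse and a stream cap could look like the sketch below. All details are assumptions rather than the PR's implementation: IDs start at 1, freed IDs are handed out again lowest-first, and the cap mirrors KAFKA_MAX_STREAMS.

```python
import heapq


class StreamIdAllocator:
    """Illustrative allocator for sequential, reusable stream IDs."""

    def __init__(self, prefix="data_stream", max_streams=10):
        self.prefix = prefix
        self.max_streams = max_streams
        self._next_id = 1
        self._free = []    # min-heap of released IDs, reused lowest-first
        self.active = {}   # id -> full topic name

    def create(self):
        """Create a new stream, reusing a freed ID when one is available."""
        if len(self.active) >= self.max_streams:
            raise RuntimeError("maximum number of streams reached")
        if self._free:
            sid = heapq.heappop(self._free)
        else:
            sid = self._next_id
            self._next_id += 1
        topic = f"{self.prefix}_{sid}"
        self.active[sid] = topic
        return topic

    def delete(self, ref):
        """Delete by numeric ID (1, 2, ...) or by full topic name."""
        sid = ref if isinstance(ref, int) else int(str(ref).rsplit("_", 1)[1])
        topic = self.active.pop(sid)
        heapq.heappush(self._free, sid)  # make the ID available for reuse
        return topic
```

Recycling freed IDs through a min-heap is one way to avoid the stream-ID exhaustion the description mentions, since topic names stay within a small dense range instead of growing without bound.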