Rambot

Latest version: v0.1.2


**2. Updating Configuration Dynamically**

```python
app.update_config(proxy="http://new-proxy.com")
app.update_config(block_images=True)
```

**3. Combining Both Approaches**

```python
app = SkipTheDishesScraper()
app.config_driver(headless=True, lang="en-US")
app.update_config(proxy="http://new-proxy.com")
app.run()
```

These improvements make the scraper easier to configure and more adaptable to different environments.

---

0.1.2

**Introduction**

This release brings significant improvements aimed at enhancing flexibility, maintainability, and error traceability within the scraper framework. Key highlights include dynamic logging configurations, centralized exception handling, and greater flexibility in driver configuration. Below is a detailed breakdown of the updates:

**Key Updates:**

- **Logging Updates**: Enhanced log management with dynamic class name insertion for better traceability.
- **Exception Management**: Centralized exception handling through a new class, enabling more granular control over exceptions and logs.
- **Driver Configuration**: Improved flexibility to configure and update the scraper's driver parameters dynamically.

---

**Logging Updates**

**1. Dynamic Class Name in Log Format**

The log format has been enhanced to dynamically insert the name of the class that instantiated the `Scraper` class. This allows log messages to carry more meaningful information, making it easier to trace the source of the logs.

```python
current_format = "<green>{time:YYYY-MM-DD HH:mm:ss}</green> - <red>{class_name}</red> - <level>{level}</level> - <white>{message}</white>"
```


---

**2. `update_logger_config` Function Enhancement**

The `update_logger_config` function now accepts a dynamic `class_name` parameter, allowing the log format to be adjusted based on the subclass of `Scraper` that is generating the log messages.

```python
def update_logger_config(
    class_name: str = "Scraper",
    log_to_file: bool = False,
    file_path: str = "app.log"
):
    format_str = current_format.replace("{class_name}", class_name)
```


This update ensures that each subclass of `Scraper` logs messages with its specific class name, improving log traceability.
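The resulting behavior can be sketched in a simplified, self-contained form — here `build_format` is a hypothetical stand-in for the substitution `update_logger_config` performs internally on `current_format`:

```python
# Simplified sketch; `build_format` is a hypothetical stand-in for the
# substitution update_logger_config performs on current_format.
current_format = (
    "<green>{time:YYYY-MM-DD HH:mm:ss}</green> - <red>{class_name}</red>"
    " - <level>{level}</level> - <white>{message}</white>"
)

def build_format(class_name: str = "Scraper") -> str:
    return current_format.replace("{class_name}", class_name)

class Scraper:
    def __init__(self):
        # Each instance tags its log format with its concrete class name.
        self.log_format = build_format(type(self).__name__)

class SkipTheDishesScraper(Scraper):
    pass

fmt = SkipTheDishesScraper().log_format
# fmt now names "SkipTheDishesScraper" where "{class_name}" used to be
```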

---

**Summary of Changes:**
1. **Dynamic class name in log format**: Log format now dynamically adjusts to include the name of the calling class.
2. **`update_logger_config` enhancement**: Function now accepts a `class_name` parameter for flexible log formatting.





**Exception Management Updates**

**1. Context and Objective**

Flexible exception handling has been integrated into the scraper framework to improve error management. This new approach enables more configuration options and enhances error tracking via logs. The handling is now centralized in the newly introduced `ExceptionHandler` class, with methods to configure which exceptions should be raised and logged.

**2. Modified Classes and Methods**

**a. `ExceptionHandler` Class**

The `ExceptionHandler` class centralizes exception handling by logging errors and deciding whether exceptions should be re-raised based on configuration.

```python
import inspect
import traceback
import typing

from loguru import logger  # loguru-style logger assumed from the log format above


class ExceptionHandler:
    def __init__(self, must_raise_exceptions: typing.List[typing.Type[Exception]] = [Exception]):
        self.logger = logger
        self.must_raise = must_raise_exceptions

    def handle(self, e: Exception) -> None:
        # Identify the calling function for the log message.
        frame = inspect.currentframe().f_back
        function_name = frame.f_code.co_name

        error_message = f"Error in {function_name}: {str(e)}"
        traceback_details = traceback.format_exc()

        # Every handled exception is logged...
        self.logger.warning(f"{error_message}\n{traceback_details}")

        # ...but only the configured types are re-raised.
        if any(isinstance(e, exc_type) for exc_type in self.must_raise):
            raise e
```


**b. `Scraper` Class**

The `Scraper` class has been modified to integrate the `ExceptionHandler` for dynamic exception management. The new `config_exceptions` method allows customization of which exceptions should be raised and logged.

```python
class Scraper:
    def config_exceptions(self, must_raise_exceptions: typing.List[typing.Type[Exception]] = [Exception]) -> None:
        self.exception_handler = ExceptionHandler(must_raise_exceptions=must_raise_exceptions)
```

**Advantages of This Approach:**
- **Flexibility**: Easily configure which exceptions to raise without altering the core logic.
- **Centralized Error Management**: All error handling logic is centralized in `ExceptionHandler`, improving maintainability.
- **Dynamic Exception Handling**: With `config_exceptions`, exception handling can be customized at runtime based on the scraper's needs.


**3. Using `config_exceptions`**

```python
scraper = Scraper(config=ScraperConfig())
scraper.config_exceptions(must_raise_exceptions=[ValueError, KeyError])
```


With this configuration, every handled exception is logged, but only `ValueError` and `KeyError` are re-raised.
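This raise-or-swallow behavior can be demonstrated with a minimal, runnable sketch (using the standard `logging` module in place of the framework's logger; `MiniExceptionHandler` and `run` are simplified stand-ins):

```python
import logging

logger = logging.getLogger("sketch")

class MiniExceptionHandler:
    """Simplified stand-in for ExceptionHandler: log everything, re-raise some."""

    def __init__(self, must_raise_exceptions=(Exception,)):
        self.must_raise = tuple(must_raise_exceptions)

    def handle(self, e: Exception) -> None:
        logger.warning("handled: %r", e)    # every exception is logged
        if isinstance(e, self.must_raise):  # only configured types propagate
            raise e

handler = MiniExceptionHandler(must_raise_exceptions=[ValueError, KeyError])

def run(exc: Exception) -> str:
    try:
        raise exc
    except Exception as e:
        try:
            handler.handle(e)
            return "swallowed"
        except Exception:
            return "re-raised"

print(run(ValueError("bad value")))    # re-raised
print(run(RuntimeError("transient")))  # swallowed
```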


**Possible Future Improvements:**
- **Notifications**: Extend `ExceptionHandler` to send notifications for critical errors (e.g., via email or Slack).
- **Additional Log Types**: Introduce more log levels (e.g., `INFO`, `ERROR`) for different exception types.





**Driver Configuration Method Updates**

**Summary**

To enhance flexibility and maintainability, the `Scraper` class now includes the following updates:
- **Improved `config_driver` Method**: Customize important parameters such as `headless`, `proxy`, and `user_agent`.
- **New `update_config` Method**: Dynamically modify configuration parameters without resetting the entire setup.
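The intended split between the two methods can be sketched roughly like this — a simplified stand-in (`MiniScraper`), not the framework's actual implementation:

```python
class MiniScraper:
    """Simplified stand-in illustrating config_driver vs. update_config."""

    def __init__(self):
        self.config = {}

    def config_driver(self, **options):
        # Full (re)configuration: replaces the driver options wholesale.
        self.config = dict(options)

    def update_config(self, **options):
        # Incremental update: merges new values, keeping the rest intact.
        self.config.update(options)

app = MiniScraper()
app.config_driver(headless=True, lang="en-US")
app.update_config(proxy="http://new-proxy.com")
# app.config now holds headless, lang, and proxy together
```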

**Usage Examples**

**1. Initializing the Scraper with Custom Configuration**

```python
app = SkipTheDishesScraper()
app.config_driver(headless=True, lang="en-US")
app.run()
```

0.1.1

0.1.0

🚀 **Rambot** is a Python package designed to simplify HTTP requests and web scraping. This first version establishes the foundation of the project with a modular structure and essential features.

✨ **Key Features**
- 📡 **HTTP request management** via `rambot.http.requests`
- 📂 **Database module** for SQLAlchemy interactions
- 🕵️ **Built-in scraper** with data extraction tools
- 🛠️ **Modular architecture** for easy expansion

🔧 **Installation**
```sh
pip install --upgrade rambot
```


🛠️ **Upcoming Improvements**
- 📌 Adding unit tests
- ⚡ Performance optimizations
- 🔗 Support for more data formats
