
:wave: aiosow


<p align="center">
<img src="https://exorde-labs.github.io/aiosow/logo.png" width="350px" />
</p>

:seedling: Context & Origin

<p align="center">
<img src="https://camo.githubusercontent.com/6e230e4fa49a8ecf84133ffe3fe228b2429ab510d7572ada396f9efabd3d9423/68747470733a2f2f6d656469612e65786f7264652e696f2f6272616e642f6c616e6473636170652d6c6f676f2d636f6c6f722e737667" width="350px" />
</p>

It's 2023 and I'm tasked with building an architecture for the [Exorde Client](https://github.com/exorde-labs/ExordeModuleCLI).

The client has these constraints:
- it has to be in python (for data-scientists)
- it has to manage a lot of i/o
- it has to do heavy computations
- it has to communicate with a blockchain
- it has to be able to update itself

**`aiosow` by itself does not address any of those constraints.** In fact, all of them are handled by the Exorde client itself.

However, **`aiosow`** is what glues it all together, and this document covers both its origins and the functionality it provides.

:dart: The scope

**What is failure for a software architect?**

Implementing someone else's code.

**What was preventing my team from contributing directly?**

I was failing to provide an immediate answer to:
- **`where`** the code lives
- **`how`** to call it
- **`when`** to call it
- **`what`** parameters it should take

And that's a lot of questions to answer in a moving piece of software.

**But if you are the one tasked as architect, it's your job to figure all of this out, not theirs.**

:flags: Separation of concerns

Let's take a look at what's happening in the front-end industry: complex web apps are now maintained by **teams** of developers.

How does this happen?

Here are some of their concepts,

- *Components* are functions that handle layout
- *Actions* are functions that do something

So we have **two functions**:
- one to handle **rendering** (how it's displayed)
- the other is the **action** (what it does)

These two functions can live in their own files. We get distinct code spaces in which a designer and a specialized technical developer can each work.

The designer doesn't care what the action does; all they need to know is that they should call it.

So how does that happen?

**You take a `component` and you give it an `action`.**

You render a **`Button`**, which when clicked, triggers the **`action`** you provided.

```js
const Button = (action) => <button onClick={action}>Click me!</button>; // written by the designer
const an_action = () => alert("youhou!");                               // written by the technical programmer
const button_render = Button(an_action);                                // might be written by the product owner
```


:surfer: Functional Programming

Functional programming is a technique in which a program is designed to manipulate functions: functions are passed around and operated on like any other value.

```python
def a_function(the_function):
    do_something_with(the_function)
```

> `a_function` does something with `the_function`

`aiosow` makes use of the functional paradigm to create bridges that answer the questions we raised earlier.

:house: First use case: Initialization

For example, let's take the problem of initialization. Every feature might have something to do at this stage of the process.

But should every programmer have to look into the init file, `main.py`, or `command.py` in order to add something at initialization?

When we let this happen,
- we create friction around specific files,
- contributors might get lost in the code base, waste time, or even fail,
- problems are not solved in their own files, creating a natural mess,
- we have to answer the `where` and `when` questions immediately.

**How can the functional paradigm benefit us here ?**

:memo: Registers instead of space constraints

Instead of imposing a space constraint on our team, we can provide them with functions that define how their code should be used.

This allows programmers to let code live wherever they believe is the most appropriate place, instead of being tied to a specific location by technical constraints.

**So how do we code this ?**

> main.py
```python
SETUPS = []

def setup(function):
    SETUPS.append(function)
    return function

def init():
    for function in SETUPS:
        function()

if __name__ == '__main__':
    init()
```

> some code somewhere
```python
from main import setup

@setup
def init_foo():
    ...  # do something
```


The benefits are:
- The contributor doesn't have to understand the setup process
- The `setup.py` file won't be modified for every new option
- The contributor doesn't need to know where the setup happens
- The contributor doesn't need to know when the setup happens

Model-Controller

Most software architectures make use of the Model-Controller pattern.

This is what we express when we split elements like we did earlier, except that instead of being tied to concepts like "model" and "controller",
files are tied to the specific problem they are solving.

- setup.py : solves the setup problem
- pictures : solves the pictures integration
- users : solves the users integration

This makes you consider code from the problem's perspective instead of from technical constraints about where code should live.

In fact, we have replaced the **`where`**, **`when`**, and even **`how`** questions with a **`function`** that we can now communicate.

:warning: Decorators

Python provides [`decorators`](https://docs.python.org/3/glossary.html#term-decorator).

It's a syntax for applying a higher-order function directly at the function declaration.

```python
@my_decorator
def foo():
    pass
```


Decorators seem like a nice feature, but **their convenience is a kiss of death**.

When you define what your code should do in the same place the implementation is defined, you restrict the ways the function can be used.

If you use any decorator on it, the function will have it bound no matter what you do.

This problem is called `configuration as a side effect of importing`.

For this reason, `bindings` are usually separated from the `implementation`.

This allows us to reuse an implementation without worrying about potential side effects.
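
To make that concrete, here is a rough sketch of the separation (the file names and functions are illustrative, not from `aiosow` itself): the implementation stays decorator-free, and the binding is applied elsewhere as a plain call.

```python
# implementation.py -- plain functions, no knowledge of the framework
def load_configuration():
    return {"retries": 3}   # a returned dict can later be perpetuated in memory

# bindings.py -- the only place that decides *when* the implementation runs
from implementation import load_configuration

setup(load_configuration)   # applied as a call, not as a decorator on the definition
```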

:clock1: Routines

I needed the client to do some tasks for me regularly.

I wrote `routine`, which became the base for most of the CLI.

It calls the specified function at the specified frequency, turning the client into a daemon.

```python
routine(frequency)(function)
```

This runs the function at the specified frequency. Easy enough.
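
This is not `aiosow`'s actual code, but a minimal sketch of how such a `routine` register could be driven by an asyncio task loop, following the same register pattern as `SETUPS` above:

```python
import asyncio

ROUTINES = []  # (frequency, function) pairs registered for the task loop

def routine(frequency):
    def decorator(function):
        ROUTINES.append((frequency, function))   # register and return unchanged
        return function
    return decorator

async def run_routines():
    async def forever(freq, fn):
        while True:
            fn()
            await asyncio.sleep(freq)
    await asyncio.gather(*(forever(freq, fn) for freq, fn in ROUTINES))

# routine(300)(fetch) registers fetch to run every 5 minutes;
# asyncio.run(run_routines()) then drives every registered routine concurrently.
```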

But on what would that action operate? What parameters does it take? How does it know what to do?

:robot: State-Machines, expressing `what`

A state machine is a mathematical concept that can be used to model the behavior of a system. In programming, a state machine can be thought of as a set of rules that define how an object or system should behave based on its current state and any events that occur.

A state machine typically consists of a set of states, a set of events, and a set of transitions that define how the system should respond to each event based on its current state. The state of the system changes as events occur, and the system transitions from one state to another based on the rules defined in the state machine.

For example, imagine a vending machine that dispenses drinks. The state machine for this vending machine might have states such as "idle," "waiting for selection," and "dispensing drink." When a customer inserts money, the system transitions from the "idle" state to the "waiting for selection" state; once a selection is made, it moves to the "dispensing drink" state. When the drink is dispensed, the system transitions back to the "idle" state.

<p align="center">
<img src="https://github.com/6r17/6r17/blob/main/state-machine-example.png" />
</p>
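
A minimal, framework-free sketch of that vending machine as a transition table (state and event names are taken from the description above):

```python
# (current state, event) -> next state
TRANSITIONS = {
    ("idle", "insert_money"): "waiting for selection",
    ("waiting for selection", "select_drink"): "dispensing drink",
    ("dispensing drink", "drink_dispensed"): "idle",
}

def next_state(state, event):
    return TRANSITIONS.get((state, event), state)   # unknown events keep the current state

state = "idle"
for event in ("insert_money", "select_drink", "drink_dispensed"):
    state = next_state(state, event)
    print(event, "->", state)
```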

:musical_score: Expressing `How` by providing memory

To provide more context at run time, we need a memory in which to store that context:

`do_this(based_on(that))`

> If you know object-oriented programming you might be familiar with `that`, `self`, `this`, etc.

- In order to make a `state-machine`, we have to use a `memory` as the source of truth.
- We can then use its keys just like we use variables when we code.

Autofill

`autofill` is the process of reading a function prototype and using a dictionary as a reference to populate the function's arguments on call.

It's a bit as if the computer already knew how to use your functions, except that you actually program their usage through their prototype.

To be clearer, the name of each parameter refers to the corresponding key in `memory`:

```python
def demo_function(foo):
    return foo

mem = {}
autofill(demo_function, args=[], memory=mem)  # -> None

mem = {'foo': 'bar'}
autofill(demo_function, args=[], memory=mem)  # -> 'bar'
```
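
For intuition, here is a simplified sketch of the idea (the real `autofill` also handles async functions, default values, and decorated functions): parameter names are matched against the memory's keys.

```python
import inspect

def autofill_sketch(function, args=(), memory=None):
    memory = memory or {}
    parameters = list(inspect.signature(function).parameters)
    remaining = parameters[len(args):]                       # names not covered by positional args
    kwargs = {name: memory[name] for name in remaining if name in memory}
    if len(kwargs) < len(remaining):
        return None                                          # a parameter has no value in memory
    return function(*args, **kwargs)
```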



:fireworks: Example: The vending machine

**Let's add some fun "real-life" context to it.**
We need to build a controller for our vending machine, but we might want to change prices, or we might have to replace a piece of the machine and switch drivers. In any case, right now we don't have those drivers (let's say they are on their way), and we still need the software ready for when the machine ships. The manufacturer promised the driver will be included and will be as easy as calling an API, or maybe a process, but he just doesn't remember which.


OK, let's write a naive implementation; it isn't meant to cover every possible case. The model is changed a little compared to what I showed you earlier, but the same basics apply. For instance, instead of making the user pay and then select, I made them select and then pay.

> __implementation__.py
```python
def return_money(amount, received_money):
    # call some API here
    return { "received_money": received_money - amount }

def dispense_drink():
    return { "selection": None, "state": "idle" }

def selection(selection):  # we call this with the selected item
    return { "selection": selection }

def reset():
    return { "state": "idle", "received_money": 0, "selection": None }

def receive_money(money, received_money):  # we call this whenever we receive money
    return { "received_money": received_money + money }

def proceed_payment(selection, prices):
    return { "state": "payment", "price": prices[selection] }

def check_account(selection, prices, received_money):
    if received_money > prices[selection]:
        return_money(received_money - prices[selection])
        return { "state": "paid" }
```

> __bindings__.py
```python
# at start, the machine should be idle
setup(reset)

# then the user can make a selection
on('selection', condition=lambda selection: selection)(proceed_payment)
# if there is no selection we should return the money
on('received_money', condition=lambda selection: not selection)(return_money)
on('received_money', condition=lambda selection: selection)(check_account)
on('state', condition=lambda state: state == 'paid')(dispense_drink)
```

**So what is happening here?**

Let's look at how the memory behaves while we iterate through the process.

- **First, the setup function resets the memory**
```python
{
    "state": "idle", "selection": None, "received_money": 0
}
```

- **Then the user proceeds with a selection; we call `selection()`**

```python
{
    "state": "idle", "selection": "Aiola"
}
```

Notice that the machine is still in the "idle" state; we don't really need to change its `state` because `{ state }` is different from `{ state, selection }`.

- **The user then proceeds to insert 40c, which is not enough**

```python
{
    "state": "idle", "selection": "Aiola", "received_money": 0.40
}
```

`check_account` is triggered every time `received_money` is updated and there is a selection:
`on('received_money', condition=lambda selection: selection)(check_account)`

- **The user proceeds to insert the last coin**
```python
{
    "state": "idle", "selection": "Aiola", "received_money": 1.40
}
```

- **`received_money` is updated again;** it is now enough, and the machine transitions to dispensing the can

```python
{
    "state": "paid", "selection": "Aiola", "received_money": 1.40
}
```

- **`dispense_drink` is called whenever we transition to the `paid` state**, which is expressed with:
`on('state', condition=lambda state: state == 'paid')(dispense_drink)`

The condition passes, the machine dispenses the can, and the memory resets back to its original state:
```python
{
    "state": "idle", "selection": None, "received_money": 0
}
```


> :information_source: The implementation has no concern for the framework, but it does have concerns about the surrounding code. Function prototypes carry meaning, and that is somewhat of a game changer compared to how we are used to programming.

:fireworks: Example: Remote shut-down

OK, this is where the state machine kicks in. Let's imagine a simple problem: we need to be able to shut down the software remotely.

What we can do is fetch a file regularly, and if the value of "online" is set to False, exit the process.
How do we express this?

Let's do it the classic way first,

```python
def shut_on_fetch(url):
    result = fetch(url)
    if result['online'] == False:
        sys.exit()
```

Now you've written your implementation, but where does this code live? What calls it? When is it called?

You are going to end up writing a thread with a consumer or, in the asynchronous case, the same code that's in `aiosow`, with a task.

Now let's run it with `aiosow`. It is built on top of an asynchronous task loop, which allows it to run the code without blocking anywhere.

```python
setup(lambda: { "url": "some_url" })  # results of functions that return a dict are perpetuated in memory

def fetch(url) -> dict:
    return { "online": True }  # imagine this being True and later changing to False

routine(5 * 60)(fetch)

on('online', condition=lambda online: not online)(lambda: sys.exit())  # triggered when "online" is saved in memory
```

> So what does this do?
- It runs `fetch()` on setup and then every 5 minutes. `fetch()` perpetuates the content of its result in memory.
- When `online` is perpetuated in memory with a falsy value, the lambda is triggered and exits the process.

<br/><br/><br/>

:rocket: What's next?

Before we dive into the full details of the release, we need to address what's missing or problematic, should you want to play with it.

:skull_and_crossbones: **Traceback is garbage**
This is the most problematic issue and will be addressed immediately in the coming releases. The current traceback is completely unreadable. We won't be able to provide a traceback as detailed as in procedural code (though I believe that can be worked on), but we will be able to provide a decent message.

:warning: **No error helpers**
We should warn users when they do something we can obviously tell is wrong, such as colliding alias and variable names.

:droplet: **Memory safety**
We have no tests that guarantee the absence of memory leaks.

:placard: **Standardization**
Since this is a completely new framework, there is no standard, nor real experience with its usage. It is built on the fly and requires more hacking to converge on a stable vocabulary.

:speech_balloon: Contributions

Since `aiosow` deals with relatively isolated concepts, contributions can easily be made through new vocabulary.

I'd like to cater both to projects that need a stable vocabulary and to people who like to fiddle and experiment.

To do so, `aiosow` will manage two branches of the same project:

:muscle: **stable**: whose goal is to avoid any breaking change

:stuck_out_tongue_closed_eyes: **free**: whose goal is to provide an open space for experimenting

- Integrating new vocabulary into **stable** requires the feature to be tested thoroughly in the dedicated open space.
- Changing stable vocabulary in **free** is banned; as soon as a function lands in the stable branch, it is maintained by the stable process.
- The stable process is mostly about guaranteeing that a feature keeps working in the long term and is appropriately included.

These processes aren't really defined yet, so you are all welcome to share your ideas for the project in the discussions.

`aiosow` does not have `stable` and `free` branches yet; those will appear with the 1.0.0 release, which will be the first stable release.


Until then, the **alpha version** of `aiosow` is available on PyPI and is maintained on the `main` branch.

<br/><br/><br/>



:gift: The release

- [website](https://exorde-labs.github.io/aiosow/aiosow.html)
The website is a GitHub page generated by reading the docstrings of the functions on the main branch.

- :phone: bindings.alias(name: str)
The `alias` function is a decorator that can be used to inject a value for a function argument that will be passed to the `autofill` function.

When a function is registered with `alias`, arguments with the given name are aliased: their value will be replaced with the value returned by the registered alias function when calling `autofill`. The original function can still be called normally, but when calling `autofill`, the argument's value is determined by the registered alias. The registered alias function is called and its result is used as the argument value.

**Args**:

`name (str)`: The name of the argument to inject a value for.

**Returns**:

`Callable`: A decorator that takes a function as input, registers it to have an injected argument with the given name, and returns the function unchanged.

**Example**:
```python
@alias(name='user_id')
def get_user_id():
    return 123

async def my_function(user_id):
    ...  # code here

# When calling autofill, the `user_id` argument is automatically injected
# with the value returned by `get_user_id`.
# `get_user_id` is called right before `my_function`.
await autofill(my_function, args=[], memory={})  # user_id=123
```


- :package: bindings.accumulator(size: Union[int, Callable]) -> Callable:
Batch the calls to a function. Triggers it when the bucket size is reached. If the size passed is a Callable, `accumulator` will call it with memory to get the size.

```python
@accumulator(size=10)
async def process_batch(batch):
    ...  # code here
```


- :magic_wand: bindings.autofill(function: Callable, args: Any = [], **kwargs) -> Any:
The `autofill` function takes a callable function, args, and memory as input arguments, and returns the result of calling the function with autofilled arguments.

**Args**:

- `function (Callable)`: The function to be called with autofilled arguments.
- `args (Any)`: A list of positional arguments to pass to the function.
- `memory (Any)`: A dictionary of keyword arguments to pass to the function.

**Returns**:

- `Any`: The result of calling the input function with autofilled arguments.

**Notes**:

- If the input function has default arguments, the corresponding values will be used if the input args and memory do not provide values for them.
- If the input function has been decorated, this function will unwrap the original function and use its signature to determine the arguments.

**Example**:
```python
await autofill(function, args=[], memory={'a': 1})

async def some_function_called_by_aiosow(memory):
    await autofill(function, args=[], memory=memory)
```


- :clock1: bindings.delay(seconds: float) -> Callable:
Makes sure the function takes at least seconds to run. It delays the execution of an asynchronous function by `seconds - exec_time(function)` where `seconds` is a fixed delay time in seconds and `exec_time(function)` is the time taken by the wrapped function to execute.

**Args**:

- `seconds`: The fixed delay time in seconds.

**Returns**:

- A decorator that can be used to wrap an asynchronous function.

**Example**:

```python
@delay(2.5)
async def my_function():
    ...  # code here
```


- :probing_cane: bindings.debug(trigger: Callable[[Exception, Callable, Tuple], Any]) -> Callable:
Return a decorator that calls `trigger` when the decorated entity raises an error.

`trigger` is called with:

- `error`: Exception
- `function`: Callable
- `args`: Tuple

**Args**:

- `trigger`: Callable

**Returns**:

- `decorator`: Callable

**Example**:

```python
function_to_debug = debug(pdb)(function_to_debug)
```


- :loop: bindings.each(iter: Optional[Callable] = None):
Applies a function to each item in:

- result of `iter` or
- first argument passed to the resulting function

**Args**:

- `iter`: The async generator to iterate over.

**Returns**:

- A decorator that applies a function to each item returned by the given `iter`.
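
**Example** (hypothetical usage; `pages` and `handle` are made-up names):
```python
async def pages():                # an async generator yielding items
    for number in range(3):
        yield number

def handle(page):                 # applied to each yielded item
    print("processing page", page)

handle_all = each(pages)(handle)  # wrapped callable; handle runs once per item
```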

- :ghost: bindings.make_async(function: Callable) -> Callable:
Make a synchronous function run in its own thread using `run_in_executor`.

**Args**:

- `function`: Callable

**Returns**:

- Async Callable
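
**Example** (hypothetical usage):
```python
import time

def blocking_io():
    time.sleep(1)                      # a synchronous, blocking call
    return "done"

blocking_io = make_async(blocking_io)  # now awaitable; runs in an executor thread
```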

- :satellite: bindings.on(variable_name: str, condition: Optional[Callable] = None, singularize=False):
Decorator function that registers a function to be executed when a variable of specified name is perpetuated in memory.

**Args**:

- `variable_name` (str): Name of the event to listen to.
- `condition` (Callable|None, optional): Callable object that takes the value of the event as input and returns a boolean indicating whether the registered function should be executed or not. Defaults to None.
- `singularize` (bool, optional): Boolean indicating whether the value of the event should be singularized before being passed as an argument to the registered function. If True, the value must be iterable. Defaults to False.

**Returns**:

- The decorated function.
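
**Example** (hypothetical usage; `temperature` and `start_cooling` are made-up names):
```python
def start_cooling(temperature):
    print(f"cooling down, temperature is {temperature}")

# run start_cooling whenever 'temperature' is perpetuated in memory with a value above 30
on('temperature', condition=lambda temperature: temperature > 30)(start_cooling)
```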

- :speech_balloon: bindings.option(*args, **kwargs):
Register an argparse option to be available using the command-line.

**Args**:

- `name`: str -> the name of the option (will be used as f'--{name}')
- `kwargs`: dict -> the options used by argparse
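
**Example** (hypothetical usage; the keyword arguments are assumed to be forwarded to argparse):
```python
# exposes a --url option on the command line
option('url', default='https://example.com', help='endpoint the client should poll')
```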

- :eagle: bindings.pdb(*__args__, **__kwargs__):
Launches `pdb.set_trace()`, a utility function for `aiosow.bindings.debug`.

- :floppy_disk: bindings.perpetuate(function: Callable, args: Any = [], memory: Any = {}) -> Any:
Asynchronously executes a function and perpetuates its effects in memory.

**Args**:

- `function` (Callable): The function to be executed.
- `args` (Any, optional): Positional arguments to be passed to the function. Defaults to an empty list.
- `memory`: The memory

**Returns**:

- The mutated keyword arguments of the executed function.
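
**Example** (hypothetical usage, sketching the behavior described above):
```python
def connect(url):
    return { "connected": True }      # the returned dict is merged back into memory

memory = { "url": "https://example.com" }
await perpetuate(connect, memory=memory)  # memory now also contains {"connected": True}
```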

- :construction: bindings.read_only(something: Any) -> Callable:
Wraps a value in a function.

**Args**:

- `something`: Any

**Returns**:

- A function that returns the wrapped value.
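
**Example** (hypothetical usage):
```python
version = read_only("0.1.8")
version()  # -> "0.1.8"
```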

- :house: bindings.setup(func: Callable) -> Callable:
Decorator to add a function to the list of initialization functions.

**Args**:

- `func` (Callable): Function to add to the list of initialization functions.

**Returns**:

- `func` (Callable): The same function, unchanged.
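
**Example** (hypothetical usage; as shown earlier, a returned dict can be perpetuated in memory):
```python
def prepare_defaults():
    return { "retries": 3, "timeout": 10 }

setup(prepare_defaults)   # registered to run at initialization
```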

- :pencil2: bindings.wrap(wrapper_function: Callable):
Wraps the result of a function with a given `wrapper_function`.

**Args**:

- `wrapper_function`: The function to use for wrapping the result of the decorated function.

**Returns**:

- A decorator that wraps the result of a function with the given `wrapper_function`.
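
**Example** (hypothetical usage; `to_payload` and `fetch_number` are made-up names):
```python
def to_payload(value):
    return { "payload": value }

def fetch_number():
    return 42

fetch_number = wrap(to_payload)(fetch_number)   # fetch_number() -> {"payload": 42}
```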

- :mountain_cableway: bindings.wire(perpetual=False, pass_args=True) -> Tuple[Callable, Callable]:
Returns a tuple of two decorators: trigger_decorator and listen_decorator.

The trigger_decorator decorator wraps an async function and triggers it, calling any functions registered with the listen_decorator decorator. The listen_decorator decorator registers a function to be called whenever a function decorated with trigger_decorator is called.

**Args**:

- `perpetual` (bool, optional): Whether to register the wrapped function to be called continuously. Defaults to False.
- `pass_args` (bool, optional): Whether to pass arguments to the listen functions. Defaults to True.

**Returns**:

- A tuple of two decorators: `trigger_decorator` and `listen_decorator`.

**Example**:
```python
wire_trigger, wire_listen = wire()

@wire_listen
def my_function():
    print("my_function called")

@wire_trigger
async def my_async_function():
    print("my_async_function called")

await my_async_function()
```

will result in

> my_async_function called
> my_function called
