Deep-Lynx

Latest version: v0.1.8




0.1.4

- Query operators < and > added
- Event system completely rewritten
- Queue systems built; database and RabbitMQ systems implemented
- Dynamic GraphQL and resolver generation
- UI portion of the test data generator

0.1.3

Deep Lynx was using a sophisticated database trigger to manage node and edge insertion. These triggers automatically set a `deleted_at` field whenever someone inserted a record with an existing id. Unfortunately, this method proved to be extremely time consuming on the database side and caused numerous slowdowns, even on small tables.

This release removed those triggers and replaced them with separate functions. We still have point in time versioning, and we will still be able to support point in time querying - but the `deleted_at` field will no longer be as important when viewing the data. That field now indicates manual or automatic deletion of a record, rather than anything relating to initial data ingestion.
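To illustrate how point in time querying works with `created_at`/`deleted_at` timestamps, here is a minimal sketch. The `VersionedNode` shape and `nodesAt` function are illustrative assumptions, not DeepLynx's actual API:

```typescript
// Hypothetical record shape for a versioned node; DeepLynx's real columns differ.
interface VersionedNode {
  id: string; // bigint ids are handled as strings in JavaScript
  properties: Record<string, unknown>;
  createdAt: Date;
  deletedAt: Date | null; // null = still the current version
}

// A record is "live" at time t if it was created at or before t and
// either never deleted, or deleted strictly after t.
function nodesAt(nodes: VersionedNode[], t: Date): VersionedNode[] {
  return nodes.filter(
    (n) => n.createdAt <= t && (n.deletedAt === null || n.deletedAt > t)
  );
}
```

Under this scheme the current graph is simply the set of records where `deleted_at` is null, while historical snapshots are recovered by filtering on the timestamp window.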

0.1.2

This is the version of Deep Lynx currently being scanned by the INL internal Cybersecurity Team.

0.1.1

This release completes the work to bundle the Admin Web App with the main DeepLynx instance. See the wiki for more information - but this change should be fairly seamless.

0.1.0

This is the second major release of the DeepLynx data warehouse. In this release we focused entirely on the database layer, fundamentally changing how we interact with the underlying PostgreSQL database. We also laid the foundation for several future features. The changelist below highlights all changes made as part of this release.

See the attached PowerPoint presentation for motivation behind this release -
[Database Overhaul.pptx](https://github.com/idaholab/Deep-Lynx/files/7375215/Database.Overhaul.pptx)


- Completely wiped out all previous migration scripts. This is a breaking change, and it will be tagged as such.
- Converted all UUID `id` fields for all database tables into `bigserial` types. This will allow for better and faster indexing, as well as the ability to scale more gracefully. This will be especially important once we make changes to the nodes/edges table.
- Made the code changes necessary to support the `uuid` to `bigint` change. Fortunately, not much needed to change: JavaScript cannot natively handle 64-bit integers in its `number` type, so when working with the new `bigserial`/`bigint` id fields we must continue to treat them as strings or risk loss of fidelity. This benefited the program as a whole, as very few changes were needed to handle the type change from `uuid` to `bigint`.
- Introduced a basic inheritance chain for metatypes and metatype relationships. You may now assign a parent/child combination using the *_inheritance join tables - actually incorporating inheritance at the code and HTTP server level will happen AFTER this overhaul has been completed, so as to limit the number of new features in this MR.
- Created trigger functions for ensuring that you cannot add an invalid parent/child pair - e.g. where the proposed child is actually a parent further up the proposed parent's inheritance chain.
- Created functions for retrieving all keys of a metatype/relationship - both their own keys and inherited keys
- Created trigger functions for ensuring uniqueness of the `property_name` field on metatype/relationship keys, even across inherited keys
- The `archived` field is now `deleted_at`, in preparation for tuple, or point in time, versioning
- Removed the internal fields for `graph` and `active_graph`. This is internal only and will not affect outside services. The reasoning for and completion of this will be handled in an MR to this branch.
- Converted all primary keys into `bigint` - again, this didn't require much code change, as we must represent `bigint` as strings in JavaScript due to the `number` type's inability to represent 64-bit integers.
- Paid special attention to data staging, ensuring that it was also converted to `bigint` in order to handle the large volume of incoming data
- Type mapping transformations now contain the necessary fields for handling edges connecting to nodes based on a composite of their data source id, metatype id, and original id
- Nodes table converted to partition by range (`created_at`); created a default partition
- Created trigger function for handling an insert when a node with the same composite id exists, or with the same id. This trigger function sets the old record's deleted_at field to the created_at field of the new record.
- Converted nodes update and delete functions to work with point in time versioning. Updates now insert a new record and rely on the trigger for setting the old record's deleted_at field to the created_at field of the new/updated record. Deleting a node will now simply set its deleted_at field to the current time.
- Edges table converted to partition by range (`created_at`); created a default partition.
- Edge updates and deletes handled the same way as nodes now.
- Edge table updated to allow us to link edges either by node id, or a composite id consisting of a nodes original_data_id, metatype_id, and data_source_id.
- Export tables converted to work with the node/edge changes
- Converted event registration table to use bigint, made code changes to reflect that
- Migrations for event system included
- UI updated to work with DB changes, Transformation updated to include new keys for edge transformations
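The string-handling requirement called out above comes from JavaScript's `number` type, which can only represent integers exactly up to 2^53 - 1 (`Number.MAX_SAFE_INTEGER`). A quick demonstration - the id value here is made up, not a real DeepLynx id:

```typescript
// A plausible bigserial value just past the safe-integer boundary: 2**53 + 1.
const id = "9007199254740993";

// Parsing it to a number silently loses fidelity: the nearest representable
// double is 2**53, so the id is now off by one.
console.log(Number(id) === 9007199254740992); // true

// Keeping the id as a string (or using BigInt) preserves it exactly,
// which is why DeepLynx continues to pass ids around as strings.
console.log(String(BigInt(id)) === "9007199254740993"); // true
```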
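The versioning rule that the nodes insert trigger enforces can be sketched in plain TypeScript. `NodeRecord`, `insertNode`, and `deleteNode` are hypothetical names used only for illustration; the real logic lives in PostgreSQL trigger functions, not application code:

```typescript
// Hypothetical in-memory model of the versioned nodes table.
interface NodeRecord {
  id: string;
  properties: Record<string, unknown>;
  createdAt: Date;
  deletedAt: Date | null; // null = current version
}

// Inserting a record whose id already exists closes out the previous
// version: its deleted_at is set to the new record's created_at.
function insertNode(table: NodeRecord[], record: NodeRecord): void {
  for (const existing of table) {
    if (existing.id === record.id && existing.deletedAt === null) {
      existing.deletedAt = record.createdAt;
    }
  }
  table.push(record); // the new record becomes the current version
}

// A delete simply stamps the current version with the current time;
// nothing is ever physically removed, preserving point in time history.
function deleteNode(table: NodeRecord[], id: string): void {
  for (const existing of table) {
    if (existing.id === id && existing.deletedAt === null) {
      existing.deletedAt = new Date();
    }
  }
}
```

An update, as described above, is then just an insert: the trigger-style rule retires the old version and the new row carries the updated properties.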

0.0.2

- Completely reorganized the folder structure across the entire project. Also renamed various folders above source, such as `api_documentation` and `web_ui`, to be more user friendly
- The majority of `data_storage` classes were converted into `data_mappers`, with most domain complexity, validation, and other operations moved out of them.
- Created Repositories for the major data/domain objects. The repositories contain all the validation logic as well as allowing the user to communicate with the mappers.
- Subsumed all `filter` classes into the Repositories
- Created multiple new middlewares for the express.js server
- Converted the majority of hard types to domain classes
- Modified tests to include a cleanup step
- Converted the Data Source and Exporter repositories into an easier-to-maintain factory/interface pattern
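A minimal sketch of the mapper/repository split described above. The `MetatypeMapper` and `MetatypeRepository` names mirror DeepLynx's domain, but the classes and methods here are illustrative assumptions, not the project's actual API:

```typescript
// Mappers are thin persistence layers: storage only, no validation
// or domain logic. An in-memory Map stands in for the real database.
class MetatypeMapper {
  private store = new Map<string, { id: string; name: string }>();

  save(m: { id: string; name: string }): void {
    this.store.set(m.id, m);
  }

  retrieve(id: string): { id: string; name: string } | null {
    return this.store.get(id) ?? null;
  }
}

// Repositories own validation and domain rules, and are the layer
// callers talk to; they delegate actual storage to the mapper.
class MetatypeRepository {
  constructor(private mapper: MetatypeMapper) {}

  save(m: { id: string; name: string }): Error | null {
    if (m.name.trim() === "") {
      return new Error("name is required"); // validation lives here, not in the mapper
    }
    this.mapper.save(m);
    return null;
  }
}
```

The payoff of the split is that persistence details can change (as they did in the 0.1.0 database overhaul) without touching the validation and domain logic layered above them.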

Motivation and Context
In order to sustain this project over the long term we needed to adopt a different set of design patterns. This was prompted by the upcoming "versioning" feature as well as various discussions regarding project support and longevity.

