Executive Summary
Qentelli LLC. (Qentelli) thanks Leonardo247 for providing us the opportunity to respond to the RFP for the Loss-Run Datamart.
We understand that the objective of the project is to give Leonardo executives, managers and decision-makers access to specific data and quick access to critical insights, without wasting time searching through an entire data warehouse.
Engineering Excellence – Architecting and developing a high-quality Datamart solution that can be easily scaled and extended through intelligent and thoughtful design, high code-quality standards and modern software engineering practices
Collaborative culture – Co-creation of Business Value through close collaboration with Business and IT stakeholders at Leonardo through the following shared values:
1. Build Trust through transparent communications
2. Fix issues, learn from them and move quickly – blameless post-mortems
3. Track, learn, calibrate and continually improve
Operations mindset – Keep the complexity of operating the system in mind during the design phase. A complex system with bells and whistles is fun to design and develop but can be difficult to operate
Company Overview
Qentelli is a technology company that accelerates digital and cloud transformation journeys through implementation of DevOps, Automation, Agile transformation, AI and Deep learning.
We help clients deliver software faster, more efficiently and affordably. Qentelli is headquartered in Dallas, TX with a global presence. The services teams are powered by the Innovation Group, which provides the thought leadership, problem-solving, tools and plug-ins needed for modern applications.
We believe that our proven expertise in Digital Transformation makes Qentelli the vendor of choice for this initiative. Qentelli's Executive Leadership team assures Leonardo247 that it fully aligns with the above objectives and will go the extra mile to help achieve them.
Solution Overview
To meet the goals of this RFP, Qentelli will build a fully managed Software Development Services group that functions as an extension of Leonardo247's IT team. The Qentelli team will work in close collaboration with the various stakeholders involved in this project. The Qentelli teams will formulate and establish standardized processes and methodologies for the various stages of the engineering lifecycle, recommend technologies and tools, CI/CD best practices and quality gates to improve the overall quality of code delivered by the teams. To ensure success, Qentelli will introduce a robust Governance model to enable monitoring and control for the overall engagement.
Below is a representation of the logical model of the solution designed for the DataMart.
The above diagram shows the happy-path scenario, in which the uploaded files match the Common Data Model and the processing that is set up. Each Lambda handles a specific file (Insurance Carrier, Property and other parameters necessary for that file).
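As an illustration of this per-carrier routing, the Python sketch below shows one way an S3 upload event could be dispatched by a Lambda. The bucket layout, carrier names and the `CARRIER_CONFIG` structure are assumptions for illustration only; the actual design will be finalized during the project.

```python
# Illustrative sketch only: bucket layout, carrier names and the
# CARRIER_CONFIG structure are assumptions, not the final design.
CARRIER_CONFIG = {
    "acme-insurance": {"file_type": "csv", "model": "loss_run_cdm"},
    "globex-mutual": {"file_type": "xlsx", "model": "loss_run_cdm"},
}

def handler(event, context=None):
    """Entry point for a Lambda invoked by an S3 ObjectCreated event."""
    key = event["Records"][0]["s3"]["object"]["key"]  # e.g. "uploads/acme-insurance/2024-h1.csv"
    carrier = key.split("/")[1]
    config = CARRIER_CONFIG.get(carrier)
    if config is None:
        # Unknown carrier: route to the exception workflow instead of loading
        return {"status": "exception", "reason": f"unmapped carrier: {carrier}"}
    return {"status": "accepted", "carrier": carrier, "model": config["model"]}
```

Files from an unrecognized carrier are not loaded; they are handed to the exception workflow described later in this proposal.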
As part of the RFP process, Qentelli Solution Architects reviewed the scope and the responses to the questions we posed while designing the solution. We request that the solution be reviewed as a reflection of our thought process given the limited knowledge we have at this point – the final solution will evolve during the project.
We assure Leonardo that we will take full ownership of and accountability for ensuring that the Solution meets its objectives from functional, cost and quality perspectives.
Organizational Overview and Technical Strengths
Qentelli LLC. is a Digital and Cloud Company headquartered in Dallas, Texas. The core of our offerings is automation across the engineering lifecycle – minimizing the need for human intervention and improving the agility and velocity of the overall application engineering lifecycle, thus enabling Continuous Delivery.
What we do is summarized in the infographic below:
Six patents, nine tools and a book on Continuous Delivery speak for themselves when it comes to the thought leadership we bring to Quality Engineering, CI/CD and DevOps.
About Client
Founded in 1950, the client is a network of young chief executives with approximately 24,000 members across 130 countries. Qentelli partnered with them on an Engineering Transformation in which we re-architected the entire legacy portal.
Tech-stack:
Restful APIs
.NET Stack
Azure Cloud platform
VUE.JS
CI / CD Enablement
Jane.ai
Requirement
UI / UX Refresh – Re-architecture of entire portal with focus on user membership at the core
Omni-platform experience – Leverage modern technologies to ensure cross-browser and multi-screen compatibility across devices
Automation and DevOps – Enable application development, security, infrastructure as code, and operations into a continuous, end-to-end, highly automated delivery cycle
Solution Highlights
Re-architected the entire backend using – Microservices, Cloud-enablement, and Serverless architecture
Designed the User Stories with member centricity
Data store implementation for microservices and golden SQL synchronization
Architecture redesign from scratch with an event driven approach
Modular UI designs across – Application screens, Application actions and Application components and configuration
Artificial Intelligence using Jane.ai that consumes data across the various data sources in the application to enable predictions and recommendations for the members of the portal
Benefits
High-Availability environments – enabling business continuity
Increased number of deployments that helped decrease the change failure rate
Analytics-Driven Design – Converted all member interactions and formerly unstructured sources into a useful, actionable format to optimize customer experiences
Reduction in operational cost - Introduced on-demand environments
Member centric Approach – Enhanced member experience consistently across all touchpoints and channels of interaction
As per our understanding, the following is the scope of the project considered for our Solution Design:
- Design a data mart where files from Insurance carriers will be loaded into a Common Data Model (CDM).
- If new columns are introduced that are not previously mapped or if there are issues during data load/mapping, the system will inform the uploading user and flag it for manual review by the administrator
- UI Portal to upload files, map data, view reports and perform analytics
- Metadata-driven approach to map source → target columns, identify new columns, etc.
- Input files will be in Excel, CSV and PDF formats. Incident files will also be provided in Excel
- Files will be loaded every 6 months, or on demand at a higher frequency
- Format of the files will be standard for each carrier
- Each file will have information regarding the property, sub-property and claim details, including claim amount
- As column headings could differ, the data mart will use a metadata approach to standardize/cleanse columns as far as possible (per carrier)
- Exception handler during data load, with a UI for the various types of exceptions:
- Any file format that is not Excel, CSV or PDF
- If the same number of columns is found but column names are different: this can be due to a new or replaced column
- Data that does not match the expected data type (e.g., alphanumeric values in a date field)
- Duplicate claims
- Though a loss-run report for a property can cover a very long duration, the data mart is expected to capture only the first 5 years (i.e., if data comes in for beyond 5 years, it can be handled like any other exception)
- Store data that is not in the main data model (unmapped data, data beyond 5 years, etc.) for future use
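To illustrate the metadata-driven mapping, the 5-year window and the duplicate-claim check described in the scope above, the Python sketch below shows one possible shape of the logic. The column names, carrier identifiers, the anchor date for the 5-year window and the natural key used for duplicate detection are all assumptions for illustration, to be confirmed during design workshops.

```python
from datetime import date

# Hypothetical per-carrier metadata; real mappings would live in the
# datamart's metadata tables, not in code.
CARRIER_COLUMN_MAP = {
    "acme-insurance": {"Claim Amt": "claim_amount", "Prop ID": "property_id"},
}

def standardize_columns(carrier, source_columns):
    """Map source headings to CDM names; unmapped headings become exceptions."""
    mapping = CARRIER_COLUMN_MAP.get(carrier, {})
    mapped, exceptions = {}, []
    for col in source_columns:
        if col in mapping:
            mapped[col] = mapping[col]
        else:
            exceptions.append(col)  # flagged for manual review by the admin
    return mapped, exceptions

def within_window(loss_date, as_of, years=5):
    """Keep only the first 5 years of loss history; older rows become exceptions.
    The anchor date (as_of) is an assumption to be confirmed with Leonardo247."""
    return (as_of - loss_date).days <= years * 365

def is_duplicate(seen_keys, claim):
    """Duplicate-claim check on a hypothetical natural key (carrier, claim number)."""
    key = (claim["carrier"], claim["claim_number"])
    if key in seen_keys:
        return True
    seen_keys.add(key)
    return False
```

Rows that fail any of these checks would be stored outside the main data model for future use, as noted above.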
Out of Scope
- AWS account and infrastructure for Dev, Test, Staging and Production environments will be provided by Leonardo247
- Non-functional tests such as Performance and Security
- Backup and DR implementation and testing
- User Acceptance Testing (any defect fixes from UAT will be addressed)
- No new Identity and Access Management system will be designed for this platform – existing implementations such as Okta or Cognito will be re-used if possible
Qentelli will use the Well-Architected Framework published by AWS for the Technical Architecture and Development. Some of the best practices from our experience are also incorporated in the above Solution Architecture, as described below:
- Use of AWS-native services as much as possible to reduce Infra & operational complexity
- Decentralized Data Management
- Smart Endpoints and Dumb Pipes
- Loose Coupling and High Cohesion
- Integrated Authentication
- Eventual Consistency
- High Fault Tolerance
Key Technology components used in the Architecture:
- Serverless architecture – the platform usage is not continuous and will rarely hit peak loads. During certain periods in the year, there might be a need to scale the architecture
- React-based UI for simplicity and extensibility. Can be extended for mobile apps easily in the future
- API gateway to invoke the right services as well as manage Authentication. Extensibility for normal microservices can also be achieved easily with this approach
A high-level functional architecture for the Solution is shown below:
Datamart Design Considerations
- Create a Common Data Model with a common schema that can be extended dynamically
- A master list of columns is to be created. For each column, a list of probable mapping names is to be created
- Users will be able to configure the column list for each insurance carrier from the above master column list. At this point, it may be possible to add category/sub-category details at the column level, so that users can use these for slicing and dicing in the future
- When a file from a specific carrier is loaded, the source file's columns will be compared
- If the columns match, the "mapping" is considered successful. If a column does not match, the new column is moved to the Exception log, where human intervention is expected for approval
- UI will be provided for the user to intervene and approve any new columns (for each carrier). These will be added to the master list of columns as well as to the insurance-carrier-specific metadata
- Loading of files will have 2 stages. When the columns are a 100% match, the data will automatically be loaded. User intervention will be needed when there is any deviation
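The two-stage loading decision above can be sketched as follows; this is an illustrative outline rather than final code, and the column names in the example are hypothetical.

```python
def classify_load(configured_columns, file_columns):
    """Stage 1: compare headings against the carrier's configured column list.
    A 100% match auto-loads; any deviation is routed for manual review."""
    configured, found = set(configured_columns), set(file_columns)
    new_columns = sorted(found - configured)
    missing_columns = sorted(configured - found)
    if not new_columns and not missing_columns:
        return {"action": "auto_load"}
    return {
        "action": "manual_review",
        "new_columns": new_columns,          # candidates to add to the master list
        "missing_columns": missing_columns,  # typically ignored after approval
    }
```

Stage 2 (the actual load) runs either immediately after an `auto_load` decision or after the admin approves the deviations.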
Workflow
A simplified workflow for a typical user is shown below:
Solution Architecture
Based on our initial understanding, an indicative Solution Architecture is visualized in the following diagram. Please note that as the project starts, there will be a more detailed evaluation of the needs and an updated Architecture will be developed.
If an exception such as the below occurs, then the exception is routed through a separate handler and is displayed for the admin to determine what action should be taken.
Potential types of exceptions:
- Metadata changes – additions, deletions, updates to existing metadata (such as column names, order of columns etc.)
- Value Changes – changes in how values are sent in the file (such as full values for state name instead of abbreviations)
For metadata changes, the admin will be able to add new columns from the UI. Deleted columns are typically ignored to avoid loss of data previously captured (no changes are made to the Common Data Model, but the ingestion mapping is updated to ignore the missing columns).
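As an illustration of this approval flow, the sketch below shows how an approved column might be added to the master list and the carrier metadata, and how deleted columns are simply ignored at mapping time. All function and column names here are hypothetical.

```python
def approve_new_column(master_columns, carrier_metadata, carrier, source_name, cdm_name):
    """Admin approval of a new column: add it to the master list and to the
    carrier-specific mapping (names here are hypothetical)."""
    if cdm_name not in master_columns:
        master_columns.append(cdm_name)
    carrier_metadata.setdefault(carrier, {})[source_name] = cdm_name
    return master_columns, carrier_metadata

def effective_mapping(carrier_metadata, carrier, file_columns):
    """Deleted columns are simply ignored: only columns present in the file are
    mapped, and the Common Data Model itself is never changed."""
    mapping = carrier_metadata.get(carrier, {})
    return {col: mapping[col] for col in file_columns if col in mapping}
```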
The red lines/arrows represent a typical exception workflow.
The core considerations for the above architecture are:
- Python or similar language for file processing, data extraction and mapping to the storage schema
- Use of Lambdas for the following reason:
- The data ingestion is typically done infrequently, so use of other architectures like an API model would be operationally expensive
- The platform needs to perform data-intensive processing with relatively little workflow and business logic, so a microservice architecture would not be suitable
- Exception handling can be achieved via flat files with context or posting to SQS -> SNS. An exception handler will need to be wired to the UI for approvals
- All ingested files to be stored in an S3 Bucket for future reference before any processing
- Using AWS Aurora or other DBs preferred by Leonardo247 for relational data storage helps in reducing license costs and operational complexity. In the future, it is easy to push the data into Redshift for large datasets
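The SQS-based exception path mentioned above can be sketched as below. The message shape and queue wiring are assumptions to be refined during design; the actual send via boto3 (which is available in the Lambda runtime) is shown as a comment only.

```python
import json

# Assumed queue wiring; in the Lambda runtime the message would be posted with
# boto3, e.g.:
#   boto3.client("sqs").send_message(QueueUrl=queue_url, MessageBody=message)

def build_exception_message(file_key, exception_type, details):
    """Serialize an ingestion exception with enough context for the approval UI."""
    return json.dumps({
        "file": file_key,
        "type": exception_type,  # e.g. "metadata_change" or "value_change"
        "details": details,
    })
```

The exception handler consuming this queue surfaces each message in the UI for admin approval, as described in the exception workflow above.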
An indicative timeline based on the scope and complexity is outlined below. The timeline is liable to change during the project based on the complexity, changes in business requirements, technical requirements – as we complete workshops and review the user stories, we may present an updated timeline.
All necessary access to Leonardo systems, Detailed user stories [with Definition of done], along with all file types, Business rules, current and future reports, existing documentation and any other information identified by Qentelli will be provided before the project start date. Any delays from Leonardo in providing requirements, content, clarifications, approvals, UAT, defect prioritization or any other activity may lead to delays in the schedule – such delays will increase the estimated pricing as well.
This section highlights our delivery approach and all the vital activities that we do in terms of phases.
A standard Team (squad) has members from Product/Business, Project Manager/Scrum Master, Architect, Developers, SDETs/QA and Ops
Members of this team can be from Leonardo or Qentelli, based on functional and technical knowledge
Qentelli will staff the Architecture, Engineering and QA roles, while Leonardo will provide Product Ownership/Functional SME, Infrastructure and Operational support
The Team structure for this project from Qentelli:

| Role | Location |
| --- | --- |
| Solution Architect | Offshore |
| UI Designer | Offshore |
| Web Developer | Offshore |
| DB Developer | Offshore |
| QA Engineers | Offshore |
Continuous Development and Integration
At Qentelli, our software developers have extensive knowledge of the latest technologies and are well versed in DevOps, Continuous Integration and Continuous Delivery. The image below describes the various activities that will be part of the development phase. We follow the below core principles for our development practices:
- Test-Driven Development
- Rigorous, regular refactoring
- Continuous integration
- SOLID principles for Design/Development
- S - Single-Responsibility
- O - Open-closed
- L - Liskov substitution
- I - Interface segregation
- D - Dependency Inversion
- Pair programming
- Single Repository
- Secure Development Practices based on guidelines from Microsoft and OWASP
- Git based branching strategy
Pipeline Orchestration:
A completely automated CI Pipeline can significantly reduce the time needed to move a unit of code through different stages and environments. The following diagram shows the typical Pipeline that is fully automated till deployment to Production.
NOTE: Due to time and cost considerations, fully automated pipelines will not be available in Phase 1.
Program Delivery & Governance
Program Delivery Model
Our approach to Agile software development follows the 12 principles of the Agile Manifesto for successful delivery of each increment of the end product. This approach allows Leonardo247 to change requirements at any stage of the development cycle, keeping customer satisfaction at the forefront. We make sure to enable continuous collaboration between our teams and the stakeholders at Leonardo247.
We will follow the Scrum development lifecycle, and the various phases of project management are described below.
User Story Creation: User stories are short, simple descriptions of a feature told from the perspective of the person who desires the new capability, usually a user or customer of the system. They are written throughout the agile project. Usually a story-writing workshop is held near the start of the agile project.
Governance Model
The Governance Model is a baseline of key elements that are required for project governance based on the project's scope, timeline, complexity, risk, and stakeholders. A communication plan will be developed once all the stakeholders have been identified and their interests and expectations have been defined.
A well-formulated communication plan delivers concise, efficient and timely information to all pertinent stakeholders. A high-level overview of the Governance layers we follow is outlined in the diagram below:
Pricing
The proposal will be executed in Time & Material delivery model with overall pricing estimated at:
| Item | Amount |
| --- | --- |
| Indicative Pricing | $ 101,112 |
| 1-Time Discount | $ 8,088 |
| Final Estimated Pricing | $ 93,000 |
Note:
- This proposal will be executed on Time and Material pricing model – as scope and design details are uncovered during the project, time and cost estimates will be updated accordingly
- Any major change to estimates of time and cost will be approved by Leonardo
- Offshore refers to Qentelli's offices or remote staff based in India
- The above pricing is indicative, based on an 8-hour workday for 22 days a month on average. The monthly invoice submitted to Leonardo may vary each month based on the actual number of working days [holidays, PTO]
- The above pricing is for Professional services only – all licenses and infrastructure costs will be Leonardo’s responsibility
| Dependency | Support needed from Leonardo247 | Stakeholders in Leonardo247 |
| --- | --- | --- |
| Knowledge Transition for Qentelli team | KT workshops on | Platform Architect, Platform specialist, Solution Architects |
| Access to workspace, systems, servers, for Qentelli personnel in the US and India | | Leonardo247 IT Team, Product Team approvals |
| Feedback for all deliverables and approvals to proceed | Timely review, feedback and approvals for deliverables from Qentelli | All Leonardo247 stakeholders, including but not limited to: |