Articles & Case Studies

  • Data Warehouse Platform Services (0) October 30, 2014


    KPSoft administrators will collaborate with CLIENT production environment staff, the management team and customer representatives to support the operational environments for EDIM. All preplanned activities will be communicated in advance. During the initial communications we will provide documentation highlighting the list of activities to be performed, and subsequently provide complete documentation with step-by-step procedures and administrative guides for conducting those activities. For each issue identified, our technical staff will conduct an impact analysis, find the root cause and then communicate a list of remediation steps to the production staff. Our high-level steps for providing operational support are as follows:

    Operational activities and support procedures:

    • Platform Availability – Develop SLAs (e.g. high availability) for each component of DATA WAREHOUSE AND ANALYTICS, develop documentation and support configuration, set up monitors and alarms, and provide support as described above.
    • DATA WAREHOUSE AND ANALYTICS Technical Forum – KPSoft representatives from our architectural council will provide leadership to the technical forum; collaborate with other stakeholders on the forum for governance and enforcement of SLAs on the DATA WAREHOUSE AND ANALYTICS platform; create wikis using Google Sites to document blogs, knowledge articles, tips and tuning steps for each significant component of DATA WAREHOUSE AND ANALYTICS; and review implementation plans with the forum and provide support for executing them. KPSOFT’s AGC facilitated communication between stakeholders and provided leadership for USDA FS’ VIPR BI project.
    • Platform Performance – KPSOFT administrators will configure auto scaling features (e.g. on AWS) to increase computing resources elastically during peak and normal hours; issues will be addressed as discussed above. A minimal scaling and monitoring sketch follows this list.
    • Impacts Assessment – Impacts will be analyzed by our technical staff, dependent steps will be developed, risks will be analyzed, and the resolution will be executed.
    • Platform Upgrade/Patching – Quarterly, security and major/minor release patches will be reviewed, dependencies will be highlighted and execution steps will be documented.
    • Platform Bug Fixes – All reported bugs will be tracked in our configuration management system, development teams will provide fixes, and the QC team will test and release them to production as emergency fixes or minor releases.
    • DATA WAREHOUSE AND ANALYTICS Platform Vulnerabilities – All tiers (database, application server and application) will be tracked for vulnerabilities using web application, O/S and RDBMS scanning tools. All reports will be analyzed, fixes applied, and results released to the production teams.
    • Security Documentation – KPSOFT will comply with the CLIENT IT Security Office requirements, develop documentation and implement controls as needed.
    • Audit Support – KPSOFT will comply with the CLIENT audit support requirements.
    • Configuration Management – KPSOFT will subject our development and QC environments to the same levels of security and major/minor release patching of the vendor software; KPSOFT will follow NIST and other CLIENT federal security guidelines.
    • Security Assessment & Authorization – KPSOFT will comply with and provide support to the CLIENT ATO A&A activities. We will coordinate with the CLIENT IT Security Office and conduct A&A on KPSOFT’s internal environments, identifying and mitigating issues in advance.
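    The sketch below illustrates one way the auto scaling and monitoring items above could be wired up on AWS using Python and boto3. The Auto Scaling group name, thresholds and region are hypothetical placeholders rather than the actual EDIM configuration, and an SNS topic would normally be attached so alarms reach the on-call staff.

    # Minimal sketch: a target-tracking scaling policy and a CloudWatch alarm
    # for a hypothetical warehouse application tier. Names and thresholds are
    # illustrative placeholders, not the actual EDIM configuration.
    import boto3

    autoscaling = boto3.client("autoscaling", region_name="us-east-1")
    cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")

    # Scale the (hypothetical) app-tier Auto Scaling group on average CPU,
    # so capacity grows during peak hours and shrinks back off-peak.
    autoscaling.put_scaling_policy(
        AutoScalingGroupName="edim-app-tier",          # placeholder ASG name
        PolicyName="edim-cpu-target-tracking",
        PolicyType="TargetTrackingScaling",
        TargetTrackingConfiguration={
            "PredefinedMetricSpecification": {
                "PredefinedMetricType": "ASGAverageCPUUtilization"
            },
            "TargetValue": 60.0,                        # illustrative threshold
        },
    )

    # Alarm feeding the availability/performance monitors described above.
    cloudwatch.put_metric_alarm(
        AlarmName="edim-app-tier-high-cpu",
        Namespace="AWS/EC2",
        MetricName="CPUUtilization",
        Dimensions=[{"Name": "AutoScalingGroupName", "Value": "edim-app-tier"}],
        Statistic="Average",
        Period=300,
        EvaluationPeriods=3,
        Threshold=80.0,
        ComparisonOperator="GreaterThanThreshold",
        AlarmActions=[],   # e.g. an SNS topic ARN so on-call staff are paged
    )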


  • Logical and Enterprise Data Warehouse Strategy (0) October 30, 2014

    Does your organization have multi-department, multi-dimensional, multi-structured data assets, or a combination of structured and unstructured data assets?

    This multi-org, multi-tenant scenario brings forth a diverse set of information assets and business rules that need to be collected, analyzed, and reported on for business intelligence. Traditionally in large enterprises such efforts have been addressed using a single Enterprise Data Warehouse (EDW) approach, where a collection of operational data stores and data marts is formally developed, well vetted and set up for data analysis. However, as enterprise technology has evolved, the business entity footprint is now represented in multiple formats: structured (e.g. RDBMS data), unstructured (e.g. PDFs) and semi-structured (XML documents, Excel/CSV). Establishing and changing an EDW (with its extensive modeling of interdependent hierarchical models) against constantly changing inputs is cumbersome, time consuming and expensive. Using a Logical Data Warehouse (LDW) is a more economical, quicker-turnaround approach for data that is heterogeneous and subject to change. The LDW is an architecture pattern that federates and virtualizes data from multiple data sources (kept independent at their taxonomical entity level) for gaining business intelligence.
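    To make the federation idea concrete, the sketch below exposes two heterogeneous sources (an operational RDBMS table and a departmental CSV extract) as a single logical view queried on demand, without first loading them into a physical warehouse. The connection string, file name and column names are hypothetical; a production LDW would typically sit behind a data virtualization layer rather than ad-hoc scripts.

    # Minimal sketch of the LDW federation idea: combine heterogeneous sources
    # into one logical, queryable view at request time. All names are placeholders.
    import pandas as pd
    from sqlalchemy import create_engine

    # Structured source: an operational RDBMS (placeholder connection string).
    oltp = create_engine("postgresql://user:pass@oltp-host/sales")
    orders = pd.read_sql(
        "SELECT customer_id, order_total, order_date FROM orders", oltp
    )

    # Semi-structured source: a CSV extract from another department.
    customers = pd.read_csv("crm_customers.csv")  # expects customer_id, region columns

    # Virtualized, federated view: joined at the shared business-entity key,
    # queried on demand rather than persisted as a new warehouse table.
    federated = orders.merge(customers, on="customer_id", how="left")
    revenue_by_region = federated.groupby("region")["order_total"].sum()
    print(revenue_by_region)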



    KPSoft has successfully implemented both LDW and EDW solutions for our large enterprise clients and will apply these methods to enable D2D. KPSoft recommends establishing and governing interoperability between the EDW and LDW, thereby providing a unified data access layer to the integrating BI applications. Please refer to the diagram below for our recommended reference architecture.

    Please find below the implementation steps and best practices that KPSOFT recommends for establishing the D2D architecture for the long term and for increased ROI:

    • Conduct metadata and logical modeling – Use a bottom-up analysis of business functions, existing definitions of business entities and OLTP databases to arrive at a comprehensive set of information assets that will ultimately define the predictive and analytical BI space.
    • Develop physical models for EDW and LDW – Apply industry best practices for installation and configuration of operational data stores, star schema, fact tables for EDW and heterogeneous data stores that are in 3rd Normal Form (3NF) for LDW.
    • Develop ETL – Establish data movements from the sources to the EDW and LDW as required; a minimal sketch follows this list. These ETLs will also facilitate data movement between the EDW and LDW for their interoperability toward providing a unified data source to the other tiers of D2D.
    • Establish Service Level Agreements (SLAs) – Conduct thorough JAD-style sessions with various end user communities, software vendors and infrastructure stakeholders to establish a hierarchical set of SLAs. The top layers will include a common core set of standards and SLAs across the customer agencies, while the bottom layers will include SLAs specific to each agency. The SLAs will then be mapped to the reliability, fault tolerance, scalability, tuning and performance of the D2D components (MicroStrategy, Fuse etc.).
    • Configure User Access Layer – Create a single user interface layer for all BI applications to connect to. Security policies that comply with your organizational information security regulations, along with personalization and customizations, will be applied to this layer.
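    As a companion to the ETL step above, here is a minimal, hypothetical extract-transform-load pass in Python/pandas that moves daily orders from an OLTP source into a star-schema fact table in the EDW. Connection strings, table and column names are placeholders and would be replaced by the client's actual sources and conformed dimensions.

    # Minimal ETL sketch: extract from an OLTP source, conform a dimension key,
    # and load a star-schema fact table. All names are hypothetical placeholders.
    import pandas as pd
    from sqlalchemy import create_engine

    source = create_engine("postgresql://user:pass@oltp-host/sales")
    edw = create_engine("postgresql://user:pass@edw-host/warehouse")

    # Extract: pull the latest transactions from the operational store.
    orders = pd.read_sql(
        "SELECT order_id, customer_id, order_date, order_total FROM orders "
        "WHERE order_date >= CURRENT_DATE - INTERVAL '1 day'",
        source,
    )

    # Transform: look up surrogate keys from the customer dimension and derive
    # the date key used by the star schema.
    dim_customer = pd.read_sql("SELECT customer_key, customer_id FROM dim_customer", edw)
    fact = orders.merge(dim_customer, on="customer_id", how="inner")
    fact["date_key"] = pd.to_datetime(fact["order_date"]).dt.strftime("%Y%m%d").astype(int)
    fact = fact[["order_id", "customer_key", "date_key", "order_total"]]

    # Load: append the new rows into the fact table.
    fact.to_sql("fact_orders", edw, if_exists="append", index=False)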
  • OAM OIM Intranet/Extranet User Separation (0) October 5, 2011

    Every large organization that has an extranet presence and has deployed Oracle Identity Management would like to organize its LDAP directory structures differently. This post provides an overview of how this can be achieved; for more details, please contact us. The configuration varies based on a clustered or single-server infrastructure, and on how the relevant access managers are laid out. This post focuses primarily on the underlying OID/OVD configuration and its linkages to OAM/OIM. In our setup we have two OAM instances, two OIM instances and a single OID, all replicated in a WebLogic cluster (two nodes each). Please click here for setup and configuration.
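    As a rough illustration of the separation idea (not the exact steps in the linked write-up), the Python/ldap3 sketch below keeps intranet and extranet users under distinct OUs so that each OAM authentication scheme and OIM user type can be scoped to its own search base. The host, port, credentials and DNs are hypothetical; in a clustered OID deployment the connection would normally go through a load-balanced endpoint.

    # Sketch: separate directory containers for internal and external users.
    # Host, credentials and DNs are hypothetical placeholders.
    from ldap3 import Server, Connection

    server = Server("ldap://oid.example.com:3060")
    conn = Connection(server, user="cn=orcladmin", password="CHANGE_ME", auto_bind=True)

    # Distinct OUs for the two user populations.
    conn.add("ou=IntranetUsers,dc=example,dc=com", "organizationalUnit")
    conn.add("ou=ExtranetUsers,dc=example,dc=com", "organizationalUnit")

    # Example entries; OAM/OIM point their identity store search bases at the
    # matching OU (ou=IntranetUsers for employees, ou=ExtranetUsers for partners).
    conn.add(
        "uid=jdoe,ou=IntranetUsers,dc=example,dc=com",
        ["inetOrgPerson"],
        {"cn": "Jane Doe", "sn": "Doe", "uid": "jdoe"},
    )
    conn.add(
        "uid=partner1,ou=ExtranetUsers,dc=example,dc=com",
        ["inetOrgPerson"],
        {"cn": "Partner One", "sn": "One", "uid": "partner1"},
    )
    print(conn.result)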

  • Oracle BI Publisher and OIM integration for identity usage reports (0) October 5, 2011

    We worked on an Oracle OIM implementation and used BI Publisher to generate identity and access usage reports for applications. Please find the details posted here.

  • Continuous Integration (0) August 23, 2011

    There is a lot of discussion out there about the pros and cons of continuous integration (CI) practices. The key advantage is that this sort of approach can be used as an early warning system for issues with your code. Since it is fairly easy to set up, the disadvantages, if any, are trivial (at least in my mind) and not worth discussing until you have implemented at least one set of CI tools. Please refer here for a setup that I implemented recently that took me less than an hour.
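    To illustrate the early warning idea (this is not the tooling from the setup linked above), the toy Python loop below polls a repository, and whenever a new commit appears it runs the test suite and reports the outcome. The repository path, polling interval and notification step are hypothetical; a real CI server adds queuing, build history and notifications on top of this basic cycle.

    # Toy CI loop: poll the repository, and on each new commit run the tests
    # and surface the result immediately. Paths are placeholders.
    import subprocess
    import time

    REPO_DIR = "/path/to/checkout"   # placeholder working copy
    last_built = None

    def current_revision():
        out = subprocess.run(
            ["git", "rev-parse", "HEAD"], cwd=REPO_DIR,
            capture_output=True, text=True, check=True,
        )
        return out.stdout.strip()

    while True:
        subprocess.run(["git", "pull", "--ff-only"], cwd=REPO_DIR, check=True)
        rev = current_revision()
        if rev != last_built:
            # New commit: run the suite and report pass/fail right away.
            result = subprocess.run(["python", "-m", "pytest", "-q"], cwd=REPO_DIR)
            status = "PASSED" if result.returncode == 0 else "FAILED"
            print(f"Build for {rev[:8]} {status}")   # e.g. swap for an email or chat alert
            last_built = rev
        time.sleep(300)  # poll every five minutes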