COMPAREX i2GO

COMPAREX i2GO is a security analytics solution for multi-source data analysis. It covers structured and unstructured information from more than 100 types of sources, including websites, social media, electronic documents, and graphics. Built on the IBM i2 engine, COMPAREX i2GO searches the requested data and prepares it for the analytics process. It also brings a set of predefined templates and analytic models that make the implementation almost ready to go.


COMPAREX i2GO provides advanced, comprehensive tools for analysts and other users who work on multi-source data investigations. COMPAREX i2GO brings:

  1. Compatibility with more than 100 data source types
  2. Real-time data access with powerful on-demand caching of source data
  3. Easy access to cloud data sources, including social networks such as Facebook and Twitter
  4. Easy access to mailboxes via Gmail, POP3, IMAP, and SMTP
  5. Search across both structured and unstructured data
  6. Analysis results that can be presented with any reporting tool
  7. An easy-to-use, self-service web interface for configuration and management
  8. Seamless connection to Cognitive Services (soon Watson Analytics) accessible from SQL, allowing use of cutting-edge artificial intelligence services from any analytical tool (see the sketch after this list)
  9. Quick data extraction for offline analytics, even for big data sets, and easy portability of the backend storage
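
A minimal sketch of what point 8 could look like in practice, assuming a generic ODBC connection to an i2GO SQL endpoint and a hypothetical table-valued function COGNITIVE_TAG that forwards document content to a Cognitive Services tagger; the DSN, table, and function names are invented for illustration and are not the product's documented API.

    import pyodbc  # any Python DB-API/ODBC driver would work the same way

    # Hypothetical DSN for the i2GO SQL endpoint; credentials are placeholders.
    conn = pyodbc.connect("DSN=i2go;UID=analyst;PWD=secret")
    cursor = conn.cursor()

    # Hypothetical: a Cognitive Services tagger exposed as a table-valued
    # function, callable straight from SQL as the "accessible from SQL"
    # claim suggests. Table and function names are assumptions.
    cursor.execute("""
        SELECT d.document_id, t.tag, t.confidence
        FROM documents AS d
        CROSS APPLY COGNITIVE_TAG(d.content) AS t
        WHERE t.confidence > 0.8
    """)
    for row in cursor.fetchall():
        print(row.document_id, row.tag, row.confidence)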

Combined with the IBM i2 product family, it delivers full functionality for the intelligence services of public-sector, finance, and insurance customers.

EXAMPLES OF SCENARIOS

  • Combining data from local hard drives, Facebook, and email accounts under one common interface with centrally governed security, plugged into i2 using already existing i2 connectors.

  • The well-known SQL interface allows analysts to define a set of rules for searching data sources without getting into the technicalities of a specific data source. Rules can be reused later, even against data sources that did not exist at the time the rules were developed (see the first sketch after this list).

  • The elastic architecture allows for on-demand development of specialized data providers, for example integration with OCR software or scraping data from web pages (see the second sketch after this list).

  • Physical separation of data processing interfaces: an Internet-facing interface extracts and materializes data, which is then moved to a secure, isolated environment for analysis.

  • All connected data sources can be searched for specific artifacts such as documents and photos, and the content can then be processed automatically by specialized artificial intelligence services to identify suspects in photos and videos, detect specific wording in documents and automatic transcripts, or derive personality insights from how a person writes.
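
To make the reusable-rules scenario concrete, a rule could be kept as an ordinary SQL query against a logical source name, so it keeps working when new physical sources are later registered under that name. The schema, source alias, and SQL dialect below are hypothetical.

    import pyodbc

    # Hypothetical reusable rule: "mailboxes" is a logical source that may map
    # to Gmail, IMAP, or providers added after the rule was written.
    RECENT_TRANSFER_MAIL = """
        SELECT m.sender, m.subject, m.received_at
        FROM mailboxes AS m
        WHERE m.body LIKE '%wire transfer%'
          AND m.received_at > DATEADD(day, -30, CURRENT_TIMESTAMP)
    """

    conn = pyodbc.connect("DSN=i2go;UID=analyst;PWD=secret")
    for row in conn.cursor().execute(RECENT_TRANSFER_MAIL):
        print(row.sender, row.subject)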
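
And for the elastic-architecture scenario, a specialized provider might implement a small plugin contract like the one sketched below; the interface and method names are invented for illustration and are not i2GO's actual API.

    from dataclasses import dataclass
    from typing import Iterable, Iterator

    @dataclass
    class Record:
        source: str    # where the artifact came from (path or URL)
        payload: dict  # extracted content and metadata

    class DataProvider:
        """Hypothetical contract every custom provider would implement."""
        def search(self, query: str) -> Iterator[Record]:
            raise NotImplementedError

    class OcrProvider(DataProvider):
        """Illustrative provider that runs OCR over scanned pages."""
        def __init__(self, ocr_engine, pages: Iterable[tuple]):
            self.ocr = ocr_engine  # injected OCR wrapper, e.g. around Tesseract
            self.pages = pages     # (path, image) pairs from a watched folder

        def search(self, query: str) -> Iterator[Record]:
            for path, image in self.pages:
                text = self.ocr.image_to_text(image)  # assumed engine method
                if query.lower() in text.lower():
                    yield Record(source=path, payload={"text": text})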

 
