ODBC SQL Server

Connectivity Summary

An out-of-the-box connector is available for SQL Server databases via ODBC. It provides support for crawling database objects and profiling sample data. Connectivity to SQL Server is established through ODBC APIs (a separate .NET application deployed for this purpose, which connects with SQL Server and returns data to OvalEdge).

Note: Running the .NET project is mandatory to connect with SQL Server via ODBC APIs.
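
The underlying ODBC handshake that the .NET API performs can be illustrated with a minimal sketch. The snippet below is a hypothetical example using the Python pyodbc package, with placeholder server, database, and credential values; it is not the OvalEdge or .NET implementation itself, only an illustration of connecting to SQL Server over ODBC and returning data.

    import pyodbc

    # Hypothetical connection values; replace with your own server, database, and credentials.
    conn_str = (
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=myserver.example.com,1433;"
        "DATABASE=SalesDB;"
        "UID=oe_service_account;"
        "PWD=secret"
    )

    # Open the ODBC connection and run a trivial query to confirm connectivity.
    with pyodbc.connect(conn_str, timeout=30) as conn:
        cursor = conn.cursor()
        cursor.execute("SELECT @@VERSION")
        print(cursor.fetchone()[0])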

 

Prerequisites

The following prerequisites are required to connect to an ODBC SQL Server database.

  1. API: An additional .NET project has to be deployed wherever this connector is used, to connect with SQL Server.
  2. User Permissions: The minimum permission and privilege required for the user to crawl database objects and profile sample data is OE ADMIN (Superuser).

The following are the minimum permissions required for OvalEdge to validate the ODBC SQL Server connection.

  • Permission: READ
  • Roles: Crawler Admin
  • Super User: OE ADMIN

Note: Navigate to Configuration > Users & Roles for roles and permissions.

Technical Capabilities

Crawling

  • Supported objects: Tables, Table Columns, Views, Functions, Procedures, Synonyms
  • Supported data types: Bigint, Char, Date, Datetime, Datetime2, Decimal (P, S), Float, Int, Money, Nchar, Ntext, Numeric (P, S), Nvarchar, Real, Smalldatetime, Smallint, Smallmoney, Time, Tinyint, Varchar

Profiling

  • Table Profiling: The output is Row count, Column count, and View sample data.
  • Views Profiling: The output is Row count, Column count, and View sample data. Note: A view is treated as a table for profiling purposes.
  • Column Profiling: The output is Min, Max, Null count, Distinct, and Top 50 values.
  • Full Profiling: Supported
  • Sample Profiling: Supported

Query Execution

  • Select: Supported
  • Insert: Not supported by default
  • Update: Not supported by default
  • Delete: Not supported by default
  • Joins within database: Supported
  • Joins outside database: Not supported
  • Aggregations: Supported
  • Group By: Supported
  • Order By: Supported

 

Note: By default, the service account provided for the connector is used for any query operations. If the service account has write privileges, then Insert/Update/Delete queries can be executed.
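
As an illustration of the query support listed above, the following hedged sketch reuses the hypothetical pyodbc connection from the Connectivity Summary together with made-up table names. The SELECT with a join, aggregation, GROUP BY, and ORDER BY reflects what is supported by default; whether the commented-out INSERT would succeed depends entirely on the write privileges granted to the service account.

    # Assumes `conn` is the pyodbc connection sketched earlier; table names are illustrative.
    cursor = conn.cursor()

    # Joins within the database, aggregations, GROUP BY, and ORDER BY are supported by default.
    cursor.execute("""
        SELECT c.Region, COUNT(*) AS OrderCount, SUM(o.Amount) AS TotalAmount
        FROM dbo.Orders AS o
        JOIN dbo.Customers AS c ON c.CustomerId = o.CustomerId
        GROUP BY c.Region
        ORDER BY TotalAmount DESC
    """)
    for region, order_count, total_amount in cursor.fetchall():
        print(region, order_count, total_amount)

    # Insert/Update/Delete are not supported by default; they only work if the
    # service account has write privileges on the remote database.
    # cursor.execute("INSERT INTO dbo.Orders (CustomerId, Amount) VALUES (?, ?)", 42, 99.50)
    # conn.commit()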

 

Connection Details

To connect to ODBC SQL Server using the OvalEdge application, complete the following steps.

  1. Log in to the OvalEdge application.
  2. Navigate to Administration > Crawler module.
  3. Click on the + icon. The Manage Connection pop-up window with the Search Connector option is displayed.
  4. Select the connection type as ODBC SQL. The Manage Connection pop-up window with the ODBC SQL Server-specific details is displayed.
  5. Enter the field attributes required for the ODBC SQL Server connection (an illustrative sketch of typical connection attributes is shown after these steps).
  6. After entering the connection details in the required fields, click on the Save button or the Save & Configure button to establish the connection and configure the connection settings. When you click on the Save & Configure button, the Connection Settings pop-up window is displayed, where you can configure the connection settings for the selected connector.

Note: The Save & Configure button is displayed only for connectors that require settings configuration.

  7. Click on the Validate button to validate the entered connection details.

Note: You can either save the connection details first, or validate the connection first and then save it.
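
The exact field attributes displayed in the Manage Connection window are defined by the OvalEdge UI and are not reproduced here. As a hedged illustration only, the sketch below shows how typical connection attributes (server, port, database, username, password; these names are assumptions, not the connector's actual field names) map onto an ODBC connection string, and how a validation step can be approximated with a trivial query.

    import pyodbc

    # Hypothetical field attributes, mirroring what a connection form typically collects.
    fields = {
        "server": "myserver.example.com",
        "port": 1433,
        "database": "SalesDB",
        "username": "oe_service_account",
        "password": "secret",
    }

    conn_str = (
        "DRIVER={ODBC Driver 17 for SQL Server};"
        f"SERVER={fields['server']},{fields['port']};"
        f"DATABASE={fields['database']};"
        f"UID={fields['username']};"
        f"PWD={fields['password']}"
    )

    def validate_connection(connection_string: str) -> bool:
        """Return True if a trivial query succeeds, roughly what a Validate action checks."""
        try:
            with pyodbc.connect(connection_string, timeout=10) as conn:
                conn.cursor().execute("SELECT 1").fetchone()
            return True
        except pyodbc.Error as exc:
            print(f"Validation failed: {exc}")
            return False

    print(validate_connection(conn_str))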

Error Validation Details

The following are the possible error messages encountered during validation.

  1. Error while getting Access Token::13.58.: Name or service not known - Invalid Server
  2. Error while getting Access Token::connect timed out - Incorrect Port
  3. Error while getting Access Token::connect timed out - Invalid Database
  4. Error while getting Access Token::connect timed out - Invalid User Name
  5. Error while getting Access Token::connect timed out - Incorrect Password

Connection Settings

Crawler

The Crawler setting has various tabs for crawling and profiling options. The crawler options are available for all connections, but the options differ based on the selected connection. At least one of the crawler options must be enabled.


Crawler options

Tables, Views, and Columns: This crawl discovers the tables, views, and columns and brings them into OvalEdge. This is the default option for crawling.

Procedures, Functions & Views Source Code: This crawl discovers the procedures, functions, and views source code within the selected schemas and brings them into OvalEdge.

Relationship: This crawl job looks at the selected schemas, determines relationships between the tables, and brings this information into OvalEdge.
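
Object crawling of this kind is typically driven by the database's own catalog views. The sketch below is a hypothetical illustration, not OvalEdge's implementation, of discovering tables, views, and their columns over the same ODBC connection using SQL Server's INFORMATION_SCHEMA views; the schema name is an assumption.

    # Assumes `conn` is an open pyodbc connection to the SQL Server database.
    cursor = conn.cursor()

    # Discover tables and views in the selected schema.
    cursor.execute("""
        SELECT TABLE_SCHEMA, TABLE_NAME, TABLE_TYPE
        FROM INFORMATION_SCHEMA.TABLES
        WHERE TABLE_SCHEMA = ?
    """, "dbo")
    objects = cursor.fetchall()

    # Discover the columns and data types for each table or view found above.
    for schema, name, table_type in objects:
        cursor.execute("""
            SELECT COLUMN_NAME, DATA_TYPE
            FROM INFORMATION_SCHEMA.COLUMNS
            WHERE TABLE_SCHEMA = ? AND TABLE_NAME = ?
        """, schema, name)
        columns = [(col, dtype) for col, dtype in cursor.fetchall()]
        print(table_type, f"{schema}.{name}", columns)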

Crawler Rule

When setting up the regex rules, the user can write rules that either include and/or exclude schemas, tables, views, columns, procedures, and functions that start with, end with, or have middle characters as defined in the rule.
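
As an illustration of the idea only (the actual rule syntax is defined in the OvalEdge UI), the sketch below applies hypothetical include and exclude regex patterns to a list of table names, keeping names that start with, end with, or contain the characters defined in the rules.

    import re

    # Hypothetical rules: include tables starting with "sales_" or ending in "_fact",
    # and exclude anything containing "tmp".
    include_patterns = [r"^sales_", r"_fact$"]
    exclude_patterns = [r"tmp"]

    tables = ["sales_orders", "sales_tmp_backup", "inventory_fact", "audit_log"]

    def matches_any(name, patterns):
        return any(re.search(p, name) for p in patterns)

    crawled = [
        t for t in tables
        if matches_any(t, include_patterns) and not matches_any(t, exclude_patterns)
    ]
    print(crawled)  # ['sales_orders', 'inventory_fact']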

 

Profiler

Profiling a data source also helps in identifying relationships between the tables at the entity level and patterns between them. Many attributes can be specified in the profile settings.

The attributes are as follows:

  • Order: The sequence in which the profiling is done.
  • Day: The day of the week on which profiling is set to run.
  • Start/End Time: The start and end time between which profiling is set to run.
  • Number of Threads: A thread is a process in which a query is executed on a database to do single or multiple tasks. The number of threads determines the number of parallel queries that are executed on the data source.
  • Profile Type: There are four main types of data profiling (a sketch of this selection logic follows this list).
    • Sample Profiling runs the profile on a given sample size. The column statistics (such as Min, Max, Distinct, and Null Count) can differ from a full profile because they are calculated only on the sample. A sample profile is based on two main values: to execute it, select the Profile Type as "Sample" and enter a Sample Profile Size (the count of records to be profiled).
    • Auto Profiling first runs as a Query profile and switches to a Sample profile when its conditions are not met. In Query mode, when Rowcount Constraint is set to TRUE and the table row count exceeds the Rowcount Limit configured in the settings, the profile type is automatically switched from Query to Sample. When the Profile Type is Auto and Rowcount Constraint is TRUE, the Sample Profile Size should always be less than the Rowcount Limit.
    • Query Profiling applies when Rowcount Constraint is set to TRUE: if the input table row count is less than the Rowcount Limit, profiling is executed on the entire table; if it exceeds the Rowcount Limit, profiling is skipped for that table to avoid performance issues. A Profile Type of "Query" always depends on the Rowcount Limit, and the Rowcount Constraint must be set to TRUE.
    • Finally, the Disabled profile type prevents profiling on the selected data source.
  • Row Count Constraint: When set to TRUE, this enables the data rule profiling.
  • Row Count Limit: The number of rows of data to be profiled.
  • Sample Data Count: The total number of rows shown on the table Data page in the Data Catalog.
  • Sample Profile Size: The total number of rows to be included in profiling.
  • Query Timeout: The length of time, in seconds, that a query is allowed to run on the remote database before timing out.
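
The interaction between Profile Type, Row Count Constraint, and Row Count Limit described above can be summarized in a small sketch. The function below is a hypothetical restatement of that decision logic, not OvalEdge's code, and the parameter names are assumptions.

    def effective_profile_type(profile_type, rowcount_constraint, table_row_count, rowcount_limit):
        """Return which kind of profile actually runs for a table, per the rules above."""
        if profile_type == "Disabled":
            return "none"      # profiling is prevented on this data source
        if profile_type == "Sample":
            return "sample"    # profile only the configured sample size
        # "Query" and "Auto" both start from full-table (query) profiling.
        if rowcount_constraint and table_row_count > rowcount_limit:
            # Query: skip the table to avoid performance issues; Auto: fall back to sample.
            return "sample" if profile_type == "Auto" else "skipped"
        return "query"         # profile the entire table

    # Example: a 5M-row table with a 1M Row Count Limit.
    print(effective_profile_type("Auto", True, 5_000_000, 1_000_000))   # sample
    print(effective_profile_type("Query", True, 5_000_000, 1_000_000))  # skipped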
 

Note: Once the setting for profiling has been configured, go back to the Crawler screen and click “Crawl/Profile” to begin the profiling.


 

Note: Profiling will be successful only if the Day setting is "All" or matches the current day.

Remote Access

Remote Access (Data Access Authorization) is included in the crawler-specific connector settings to ensure that the right user is accessing the query sheet and queries in the data catalog. Here, the system validates the user credentials and allows that user to access the query sheet and queries in the data catalog.

  • OvalEdge follows OvalEdge data permissions: When this option is selected, the OvalEdge application uses the service account that was used for creating the connection on the Query Sheet page, the Data Catalog > Table > Data page, and the Data Catalog > Query page to fetch the required data.
  • OvalEdge follows Remote system permissions: When this option is selected, users are prompted to enter the remote data source credentials on the Query Sheet page, the Data Catalog > Table > Data page, and the Data Catalog > Query page, and the submitted credentials are used to establish the connection and fetch the required data. These credentials are cached and cleared after 7 days of inactivity. A sketch of this credential selection is shown below.
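
The difference between the two modes can be sketched as follows. This is a hypothetical illustration of the credential selection only; the function and parameter names are assumptions, and in the remote-permissions mode the submitted credentials would additionally be cached and cleared after 7 days of inactivity.

    def credentials_for_query(remote_access_mode, service_account, user_supplied=None):
        """Pick which credentials are used for Query Sheet / Data Catalog data fetches."""
        if remote_access_mode == "OvalEdge data permissions":
            # Always use the service account that created the connection.
            return service_account
        # "Remote system permissions": the user must supply remote data source credentials.
        if user_supplied is None:
            raise PermissionError("Remote credentials required; prompt the user to enter them.")
        return user_supplied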

Query Policies

  1. Click on the Query Policies tab. The data grid with the existing policy details is displayed, with Role, Query Type, Access Type, and Delete columns.
  2. To include new query policies, click on the +Add Query Policy button.
  3. Select the desired Role, Query Type, and Access Type from the respective dropdowns and click on the Save Changes button.

For example, if the user selects the Role as "OE_HRADMIN", the Query Type as "JOIN", and the Access Type as "DENY", then the users associated with the OE_HRADMIN role are restricted from using the JOIN function in the Query Sheet page.
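
Conceptually, each policy row acts as an allow or deny check evaluated against the user's roles and the type of query being run. The sketch below is a hypothetical model of that check, shown only to make the OE_HRADMIN / JOIN / DENY example concrete; the policy structure and names are assumptions.

    # Hypothetical policy rows mirroring the Role / Query Type / Access Type grid.
    query_policies = [
        {"role": "OE_HRADMIN", "query_type": "JOIN", "access_type": "DENY"},
    ]

    def is_query_allowed(user_roles, query_type):
        """Deny the query if any of the user's roles has a DENY policy for this query type."""
        for policy in query_policies:
            if (policy["role"] in user_roles
                    and policy["query_type"] == query_type
                    and policy["access_type"] == "DENY"):
                return False
        return True

    print(is_query_allowed({"OE_HRADMIN"}, "JOIN"))  # False: JOIN is denied for this role
    print(is_query_allowed({"OE_PUBLIC"}, "JOIN"))   # True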


Access Instruction

The Access Instruction option allows the Crawler admin to write specific instructions for the data source/connector.


Other

When you navigate to the Other tab, the Send Metadata Changes Notifications to section is displayed.


Send Metadata Notification to:

  1. Select whether notifications need to be sent to the Data Owner and Data Steward under the specific roles.
  2. Select the desired role from the Roles dropdown menu and click on the Save Changes button. The notifications will be sent to the selected Data Owner and Data Steward.