saschinvoke is the command-line utility in OBIEE for invoking jobs in the OBIEE Scheduler. A typical use of this command is invoking cache-seeding iBots at the end of nightly ETL loads.
In 10g, the command was invoked by passing the administrator username and password as parameters:
$saschinvoke -u Administrator/password -j 1
In 11g, passing the password as a parameter no longer works. The command prompts for the password, and the user has to type it in to invoke the job ID, which breaks any attempt to automate cache-seeding scripts. The workaround below addresses this issue:
$echo password|saschinvoke -u weblogic -j 1
Note the use of the pipe character (|): echo writes the password to saschinvoke’s standard input, so the prompt is satisfied without any interactive typing.
Hope you find this useful.
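Building on that workaround, a nightly ETL script can wrap the call in a small helper. The sketch below is a minimal example under my own assumptions: the helper name, the password-file location, and the weblogic account are mine, and saschinvoke is assumed to be on the PATH.

```shell
# run_cache_seed JOB_ID - hypothetical helper for the end of an ETL load.
# The scheduler password is kept in a file readable only by the ETL
# service account, and is piped to saschinvoke's standard input so that
# no interactive prompt appears.
run_cache_seed() {
  job_id="$1"
  cat "$HOME/.obiee_sched_pass" | saschinvoke -u weblogic -j "$job_id"
}
```

A nightly load script would then simply end with something like `run_cache_seed 1`.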
The business wants to leverage the existing Microsoft Active Directory for authentication in OBIEE 11g. In addition, IT wants the flexibility to add users such as DEV1 and QA1 for system development without adding them to the enterprise Active Directory. This basically means two authentication providers: 1. MSAD and 2. the out-of-the-box WLS LDAP.
In 10g, we had the option of integrating MSAD (Microsoft Active Directory) in the OBIEE RPD, and users like DEV1 and QA1 could also be added directly in the RPD. In OBIEE 11g, while LDAP integration is still available in the RPD, the recommended way is to do this in the WebLogic Server; adding users and passwords in the RPD is not possible in 11g.
Going by the Oracle documentation, there is a lot of confusion around this subject. This post lays out a configuration that works.
This solution assumes you have successfully installed OBIEE 11g on Windows XP and that the SampleAppLite application is up and running; the default user weblogic should be able to log in to OBIEE. In Active Directory, a user by the name ‘BISystemUser‘ has to be created – this is very important. Once the pre-requisites are done, please proceed with the steps below.
(This solution takes the simplest route: rather than creating a new trusted user to replace ‘BISystemUser’, I created a user with the same name ‘BISystemUser’ in MSAD. This avoids finding and replacing all occurrences of BISystemUser with a different trusted-user name.)
1. Log on to WLS (WebLogic Server) and navigate to Security Realms->myrealm->Providers tab
2. Click on Lock & Edit in the left pane and then click the ‘New’ button to add a new Authentication Provider.
2.1 Give the new provider a name (for example, MSAD) and set its type to ActiveDirectoryAuthenticator
2.2 The new Authentication Provider should now appear in the providers list
2.3 Click on MSAD and, in the new screen, set the Control Flag to SUFFICIENT
2.4 Click on the ‘Provider Specific’ tab and input the MSAD-specific details
The main entries in this screen are listed below (NOTE: these vary based on your company’s specific details)
User Base DN: CN=Users,DC=corp,DC=YOURCOMPANY,DC=com
All Users Filter: (&(sAMAccountName=*)(objectclass=user))
User From Name Filter: (&(sAMAccountName=%u)(objectclass=user))
User Name Attribute: sAMAccountName
User Object Class: user
Group Base DN: CN=Users,DC=corp,DC=YOURCOMPANY,DC=com
You can copy the existing entries from your 10G RPD, if you had set this up before in 10G.
The main thing to note here is ‘Principal’, which was known as the Bind User in 10g. Also change the User Name Attribute to sAMAccountName, and make sure this change is reflected in ‘All Users Filter’ and ‘User From Name Filter’ as well. I also changed the Group Base DN. Having done this, you should now see the users from your AD in the ‘Users and Groups’ tab. If you do not, the entries listed above were not entered correctly: adjust them, and also try them out within the RPD to make sure all the entries are correct.
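For reference, WebLogic substitutes the login name for the %u token in the ‘User From Name Filter’ at authentication time. The tiny sketch below (the helper name is mine, not a WLS API) reproduces that expansion, which is handy when you want to test the resulting filter against AD with a tool such as ldapsearch before touching the WLS configuration.

```shell
# build_user_filter LOGIN - illustrative only: expands the %u token in the
# User From Name Filter pattern the same way WLS does when a user logs in.
build_user_filter() {
  printf '(&(sAMAccountName=%s)(objectclass=user))' "$1"
}
```

For example, `build_user_filter jdoe` prints `(&(sAMAccountName=jdoe)(objectclass=user))`, which can be pasted into an ldapsearch against the User Base DN to verify that the account resolves.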
3. Now, in the ‘Providers’ tab, click on ‘DefaultAuthenticator’ and change its Control Flag to OPTIONAL
4. Reorder the Authentication Providers so that ‘MSAD’ comes first.
Click on Activate Changes. You might need to restart the WLS.
5. Log on to WLS (myrealm->Users and Groups tab) and make sure you see two ‘BISystemUser‘ entries: the first belongs to the DefaultAuthenticator and the second comes from MSAD. Delete the BISystemUser of the DefaultAuthenticator.
6. Log on to Enterprise Manager (EM), go to WebLogic Domain, and right-click on bifoundation_domain to reach Security->Credentials
6.1 Update the password of BISystemUser to the value set in MSAD
6.2 Go to WebLogic Domain, right-click on bifoundation_domain to reach Security->Security Provider Configuration, and click the ‘Configure’ button under ‘Identity Store Provider‘. Add the three properties below:
user.login.attr=sAMAccountName
username.attr=sAMAccountName
virtualize=true
NOTE: If virtualize=true is NOT added, then only MSAD users will be able to log in (since MSAD is the first authenticator)
7. Restart WLS and restart BIServer Components from EM.
8. That’s it. Now both the MSAD users and the WLS LDAP users will be able to log in.
How can I find the physical column behind each column in Answers? How can I know the business logic used to compute the metrics? These types of questions mostly arise from BI analysts who were not part of implementing the OBIEE solution.
There is a new feature available in OBIEE 10.1.3.4.1 called the Metadata Dictionary. Implementing this feature allows users of OBIEE to trace a Presentation column in Answers back to the logical column and finally to the database column.
Metadata Dictionary in OBIEE 10.1.3.4.1
- Log on to the RPD in offline mode and generate the metadata using Tools->Utilities->Generate Metadata Dictionary. Specify a directory when prompted to store the dictionary. A folder named ‘dictionary’ will be created in the specified location, containing a sub-folder named after the repository.
- Go to your Analytics Virtual Directory and place the ‘dictionary’ folder generated in step 1 inside it. If you are using OC4J, the virtual Directory would be C:\OracleBI\oc4j_bi\j2ee\home\applications\analytics\analytics\res
- Restart your servers and go to Answers. You will see a new icon to the right of every subject area; the same icon will also appear beside each folder.
Click on the icon and a new window will open displaying the metadata. Drill down on a specific item to navigate to its details.
“What does this data mean?
What exactly is captured in this metric?
How is it calculated?
Who owns the formula? Who owns the definition?”
As a Business Intelligence professional, you will hear these questions frequently from the end users of your application. Off the top of your head you know the definition for some of the columns/data, and for others you might need to get back to the users after digging through all the requirements and design specs (if any). It can be a bit embarrassing, when you have classy dashboards and reports, to be stuck when a user asks during a demo, “Hey, what exactly is this?”. A Data Dictionary is the solution.
Having a Data Dictionary available on demand or on the dashboards is a huge step forward in driving user adoption of your application. There is no out-of-the-box Data Dictionary solution available with OBIEE. In this blog, I will share how I implemented a Data Dictionary and exposed it in Dashboards/Answers by leveraging the existing OBIEE infrastructure.
At a high level: store the definitions in a file, load the file contents into a database table, import the table into the OBIEE RPD, expose it in Answers, and create data-definition reports.
Step 1: The data definitions for each column have to be stored in a tab-delimited text file. This file should contain the columns exposed in Dashboards/Reports and their definitions. The file I made has the following 5 columns:
1. Column Name (The name of the Presentation column)
2. Column Synonym (Is this column known by any other names? If so, populate this column as well)
3. Data Definition (The Business Definition of the data contained in the column)
4. Definition(contd..) (Make use of this field to explain more details about the column)
5. Subject Area (Presentation catalog Name)
Depending upon your specific business needs, you can add more columns. The reason I chose a tab-delimited file instead of CSV is that the definitions will contain commas, which can be a problem when loading a .csv file using ETL or other tools.
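Before loading, the file can be sanity-checked so that a stray tab inside a definition doesn’t silently shift columns. This is a minimal sketch assuming the 5-column tab-delimited layout described above; the helper name is mine.

```shell
# validate_dictionary FILE - hypothetical pre-load check: every row of the
# tab-delimited definitions file must have exactly 5 fields. Reports each
# malformed row and exits non-zero if any are found.
validate_dictionary() {
  awk -F'\t' 'NF != 5 { bad++; print "row " NR " has " NF " fields" }
              END { exit (bad > 0) }' "$1"
}
```

Running this as a gate in the load script catches formatting problems before they reach the database table.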
Step 2: Once the file is ready, populate a database table sourced from the file, import it into the RPD, and expose it in Answers in a new subject area called “Data Definitions” or “Data Dictionary” or whatever you would like to name it.
Step 3: Create a Data Definitions report for each dashboard page. For example, if you have 5 columns in a dashboard, create a report with the data-definition columns of Step 1 and make use of request filters so that only the columns shown on that dashboard page appear in the report.
Step 4: Expose this report in the dashboard page as a link.
Simple solution. Highly effective. My business users love it.
In this blog I will pen down some of the best practices to follow while implementing OBIEE solutions. It covers mostly the technical practices applicable to OBIEE repository modeling and dashboard/request design. I will talk about data modeling in a separate blog.
- Always try to import tables and columns into the Physical layer rather than creating them manually.
- This will ensure correct data types are set for each column. This is particularly useful when there is confusion between DATE and DATETIME
- Each Physical dimension table should have exactly one Primary Key. Fact tables do not need a Primary Key.
- If only a composite key is present, create a single Physical key and add all the composite key columns to it.
- Minimize Opaque Views (SELECT statements) in Physical Layer.
- Create tables (recommended) or materialized views in the data warehouse instead
- Always use Foreign Key joins in the Physical layer. Avoid using complex joins with conditions; they are bad for performance. (There are a few exceptions to this when working with Type 2 SCDs)
- Always try to use a number-to-number join. This will work faster than a varchar-to-varchar join.
- Avoid using CAST functions in the join expression. This will prevent the database from using the indexes created on that column.
- Avoid any filter conditions in the Join.
- These filter conditions can in turn be added in the LTS (Logical Table Source) ‘Where’ clause content filter or as request filter in Reports
- Fact tables should not be joined to each other. This will result in a Cartesian product, leading to double counting when summing.
- Use conforming Dimensions instead
- Connection Pool considerations:
- Require fully qualified table names should be unchecked
- Enable Connection Pooling should be checked
- Execute queries asynchronously should be checked
- Create a separate Connection Pool for Initialization Blocks
- Keep Cache persistence time of all tables as Infinite
- The columns used in Joins should be set to “NOT NULL”
- The database Features tab should be set correctly with the Parameters supported by your backend database.
- If the two are not in sync, a lot of processing will be done in the BI Server instead of the database, which hurts performance. Pay particular attention to Locale (the values are case-sensitive): a locale mismatch can cause sorting to be done in the BI Server instead of the database, and performance takes a bad hit!
- DERIVED_TABLES_SUPPORTED should be checked in the database Features tab for Oracle databases. This ensures proper function shipping to the database for TOP(N) and RANK functions
- Create Display folders to group tables according to STAR or Releases
- Set different icons on objects for each release of the code. This makes it easy to find which entity was added in which release
- Don’t leave the Description field empty. Write a meaningful description of the object; this helps a lot in later troubleshooting and impact analysis
- Minimize the use of Snow-Flakes. Always go for Star Schemas.
- Always use complex joins here. They allow the BI Server to make the best decision about the exact physical SQL to generate based on the logical query path, in contrast to a physical FK join, which forces a single join path between tables. If joined tables were dragged in from the Physical layer, replace the FK joins with complex joins
- Create Dimension Hierarchies for every Dimension in the Business Model
- Even if a meaningful hierarchy definition cannot be thought of, just create one with a Grand Total level and a Detail level.
- For dimension hierarchies, the ‘Number of Elements at this level’ should increase from 1 at Grand Total down to the corresponding number of distinct values at each level. These can be approximate values; they need not be exact.
- Define Keys at each level of the Hierarchy.
- The Content tab of each of the LTSs in Fact should be set to the related Dimension’s Logical Level
- Combine all attributes that describe a single entity into a single Logical table
- Never Delete logical columns that map to keys of Physical dimension tables
- Don’t keep unwanted Physical columns in the Logical Layer
- Give Meaningful Names to the Logical Columns. Avoid assigning a logical column the same name as a logical table or Business Model object.
- Make proper use of the where-clause Content filter of the LTS to minimize the number of records returned.
- Minimize the use of Conditional Checks and ‘CASE WHEN’ usage in the formula of Logical Columns. This will affect performance. Instead make proper use of the where clause Content filter of the LTS if the condition applies to all the columns/measures in the logical table
- When creating a logical column based on other logical columns, make sure all the columns in the expression are from the same logical table and the same LTS
- Make a proper distinction between Count and Count Distinct. If you are counting on a unique-value column, don’t use Count Distinct; it will hurt performance
- Minimize the use of outer joins within an LTS. They are resource-consuming; use default zero-ROW_WID records in the database instead.
- Make sure a particular report refers to only one LTS in a logical table, or that the different LTSs are at the same level
- Avoid dimensions in Fact tables and avoid measures in Dimension Tables
- Create Display folders to group tables according to STAR or Releases
- When using an out-of-the-box vanilla RPD, remove unwanted logical tables and hierarchies. This will minimize the time needed for Consistency Check
- Specify the most Economical Source when there are multiple LTSs for a Dimension
- Whenever you do Consistency Check, Right Click the Changed Business Model Object and go for Check Consistency rather than using the Global Consistency Check. This will minimize the time needed for Consistency Check
- Arrange the logical columns alphabetically. This will save time when you revisit.
- Fix the warnings, if any; don’t ignore them
- Each Presentation catalog should map to only one BMM object
- Use Parent Folders and Sub folders to group Facts and similar Dimensions together
- Avoid the use of Aliases when a new Presentation Column is created
- The Presentation Columns in a table should be sorted alphabetically if no specific order is asked by the customer
- Get Customer Sign-off of the Presentation layer structure before building reports. This will avoid later replacements of columns which affects the reports constructed.
- Make proper use of the Permissions in this layer
- Don’t use double quotes (") in column names, even though they are permitted
- Presentation columns should not have the same name as Presentation Table.
- Eliminate unneeded objects to reduce user confusion
- Limit the number of objects in a folder to 7-12
- Use the object’s Description field to convey information to users when they hover the mouse over a Presentation column in Answers.
- Keep names short to save space on reports