How to display data in SSRS Report from multiple datasets

Since SSRS 2008 R2, multiple datasets can be combined in a single data region on a report. Using an expression field, we can display the first matching value for a specified name from a secondary dataset that contains name/value pairs.

SQL Server Reporting Services provides three lookup functions, each taking the same four parameters:

  1. Lookup (source_expression, destination_expression, result_expression, dataset)
  2. LookupSet (source_expression, destination_expression, result_expression, dataset)
  3. Multilookup (source_expression, destination_expression, result_expression, dataset)


Lookup Function Parameters

  • source_expression. Evaluated in the current scope; specifies the name or key to look up. For example, =Fields!AgentID.Value.
  • destination_expression. Evaluated for each row in the dataset; specifies the name or key to match on. For example, =Fields!Agent_ID.Value.
  • result_expression. Evaluated for the row in the dataset where source_expression = destination_expression; specifies the value to retrieve. For example, =Fields!TeamLeader.Value.
  • dataset. Name of a dataset in the report. For example, "Agents".
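Putting the four parameters together, a Lookup call built from the example fields above would look like this (the field and dataset names are the illustrative ones from the parameter list, not from a real report):

```vb
=Lookup(Fields!AgentID.Value, Fields!Agent_ID.Value, Fields!TeamLeader.Value, "Agents")
```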


  • Lookup. Retrieves a value from the specified dataset for a name/value pair where there is a 1-to-1 relationship. For example, for an ID field in a table, you can use Lookup to retrieve the corresponding Name field from a dataset that is not bound to the data region. Returns the result expression value.
  • LookupSet. Retrieves multiple values for a single name or key field where there is a 1-to-many relationship. For example, for a customer identifier in a table, you can use LookupSet to retrieve all the associated phone numbers for that customer from a dataset that is not bound to the data region. Returns the set of result expression values.
  • Multilookup. Retrieves a set of values from a dataset for name/value pairs where each pair has a 1-to-1 relationship. For example, for a multivalue parameter based on primary key identifiers, you can use Multilookup in a text box expression in a table to retrieve associated values from a dataset that is not bound to the parameter or to the table. Returns the set of result expression values.

Because LookupSet and Multilookup return a collection of objects, you cannot display the result expression directly in a text box. Instead, you can concatenate the value of each object in the collection into a string.

Use the Visual Basic function Join to create a delimited string from a set of objects. Use a comma as a separator to combine the objects on a single line. In some renderers, you can use a Visual Basic line feed (vbCrLf) as a separator to list each value on a new line.

=Join(LookupSet(Fields!SiteGroupID.Value, Fields!ID.Value, Fields!SiteCategoryName.Value, "Category"), ",")

=Join(MultiLookup(Split(Fields!SiteCategoryList.Value, ","), Fields!ID.Value, Fields!SiteCategoryName.Value, "Category"), ", ")

  • Evaluated after all filters are applied
  • Only one level of lookup is supported. A source, destination, or result expression cannot include a reference to a lookup function.
  • Source and destination expressions must evaluate to the same data type. The return type is the same as the data type of the evaluated result expression.
  • Source, destination, and result expressions cannot include references to report or group variables.
  • Cannot be used as an expression for the following report items:
    • Dynamic connection strings for a data source
    • Calculated fields in a dataset
    • Query parameters in a dataset
    • Filters in a dataset
    • Report parameters
    • The Report.Language property


SSRS Lookup Example

SSRS Lookup report

Dataset1 includes an Agent Name identifier field (agent_name_tpi). Dataset2 contains the corresponding AgentName and the total calls answered, AnsCalls.

In the following expression, Lookup compares the value of agent_name_tpi to AgentName in each row of Dataset2 and, when a match is found, returns the value of the AnsCalls field for that row.
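The expression described above would be written as follows (the dataset name "Dataset2" is assumed from the description):

```vb
=Lookup(Fields!agent_name_tpi.Value, Fields!AgentName.Value, Fields!AnsCalls.Value, "Dataset2")
```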

Power BI – Analyse, interpret and understand your data

What is Power BI?

Microsoft Power BI is a cloud-based self-service BI solution for the enterprise. It works from within Excel and Office 365. It gives users insight into virtually any type of data and the ability to interact with charts, graphs and reports. It also serves as the portal, or site, for publishing, managing and sharing your BI reports and dashboards.


power bi solution

Power BI provides capabilities that can be leveraged by people at all skill levels, including data analysts, power users, business users, data stewards and IT staff.


Power BI Evolution

The first release of Power BI, Power BI for Office 365, was launched in February 2014 as a cloud service to simplify the deployment and maintenance of the back-end servers. This initial Power BI release did not include support for SQL Server Reporting Services or Analysis Services and, like the on-premises version, offered a limited set of capabilities and a disjointed workflow between components.

The new release of Power BI, which became available in July 2015, addressed the above challenges with:

  1. A stand-alone, free desktop authoring interface independent of Excel, Power BI Desktop, so users can develop analytic content locally without licence cost. It unified and streamlined the three distinct interfaces and capabilities of Power Query, PowerPivot and Power View.
  2. Access to on-premises enterprise data stored in SQL Server Analysis Services (SSAS), both tabular and multidimensional models.


The soon-to-be-released Excel 2016 will embed BI components rather than incorporate the capabilities through add-ins. Although Microsoft has indicated to Gartner that it is making significant investments in on-premises solutions that integrate with Power BI, it does not currently have an on-premises Microsoft solution for distributing Power BI content. Instead, it has initially partnered with Pyramid Analytics for customers that require on-premises publishing. There are plans to support publishing SQL Server Reporting Services and Datazen content to Power BI dashboards in a near-term future release.


Power BI Tools

  • Power Query. To search, discover, acquire, combine, refine, transform and enrich data. A simplified ETL tool built into familiar Excel that can discover data from a wide variety of data sources, from local and enterprise sources to online public data sources.

power query

  • Power Pivot. To create a data model over large amounts of data from virtually any source, define relationships between tables, and create measures, KPIs, hierarchies and so on. Power Pivot uses the xVelocity in-memory engine for lightning-fast analytic performance and empowers Excel users to perform powerful data analysis. With Power Pivot, data exploration and mash-ups of the model become much easier. The model can be exposed to Power View as a report-ready element for analysis and data insights.
  • Power View. Power View allows end users to explore data in new ways, uncover hidden insights and bring data to life with interactive visualisation. Power View is also available as a feature in SharePoint Server 2010 and SharePoint Server 2013 Enterprise Edition as part of the SQL Server 2012 Service Pack 1 Reporting Services Add-in.

power view reports


  • Power Map. Extends the capability of Power View by enabling end users to visualise data in 3D. It plots geographical and temporal data visually, lets you analyse it in a 3D visualisation format, and creates an interactive, guided cinematic tour so you can discover new insights by seeing your data in geographic space, watch time-stamped data change over time, and share the result with others.


Database Mirror TimeOut

Here is a SQL Server error log message I detected on one of our SQL Server 2008 R2 instances:

"The mirroring connection to "TCP://server_name:5022" has timed out for database "<dbname>" after 10 seconds without a response. Check the service and network connections."




database mirroring timeout message


The message is related to Database Mirroring failover procedure.

With the High Availability mode of synchronous Database mirroring with automatic failover there are three servers that are involved: Principal, Mirror and Witness.

The three servers constantly ping each other, forming a quorum, and if one of the machines is not available the other machines determine how to handle the failover.

In this particular instance the principal did not receive any response to its pings from the witness for 10 seconds, so a failover was initiated after 10 seconds without a response. 10 seconds is the default partner timeout setting.

The most common reasons why the ping response fails:

  • slow network connection or network reliability
  • geographically dispersed mirror partners
  • hardware/SAN/VM failure

What can be done to avoid accidental database failover?

Increase Partner Time Out value.


What is Partner timeout?

  • The maximum period of time a server instance waits to get a "ping" message from another SQL Server instance in the mirroring session before considering that instance to be disconnected
  • The TIMEOUT option can only be set on the principal server
  • Specifies the time-out period in seconds
  • The option cannot be set through the GUI
  • The default time period is 10 seconds
  • If you set it to anything between 0 and 4 seconds, the timeout is automatically set to 5 seconds; if you set it to 5 seconds or greater, it is set to that value
  • The number specified must be an integer


To change Partner Timeout to 20 sec
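The command is the standard ALTER DATABASE syntax, run on the principal (<dbname> is a placeholder for your mirrored database):

```sql
-- Run on the principal server of the mirroring session
ALTER DATABASE [<dbname>] SET PARTNER TIMEOUT 20;
```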



Other Useful queries while troubleshooting Database Mirroring

Checking the timeout value:

SELECT mirroring_connection_timeout
FROM sys.database_mirroring
WHERE database_id = DB_ID('<dbname>');


Query to check the associated ports with DB Mirroring

SELECT type_desc, port FROM sys.tcp_endpoints WHERE type_desc LIKE '%MIRROR%';


Query to check the state of the DB Mirroring

SELECT state_desc FROM sys.database_mirroring_endpoints

Type In-row data is incorrect. Run DBCC UPDATEUSAGE.

The Issue:

Integrity Check (DBCC CHECKDB) produces the error

The In-row data USED page count for object “my_table”, index ID 9, partition ID 2609137898356736, alloc unit ID 2609137898356736 (type In-row data) is incorrect. Run DBCC UPDATEUSAGE.

CHECKDB found 0 allocation errors and 2 consistency errors in table ‘my_table’ (object ID 1157579162).

CHECKDB found 0 allocation errors and 2 consistency errors in database ‘my_database’.
Outcome: Failed


The database had just been moved/upgraded from SQL Server 2000. SQL Server 2000 did not always keep the page-space-used metadata accurate. From SQL Server 2005 onward, running DBCC UPDATEUSAGE was part of the migration/upgrade plan; if you skipped it, this error appears when you run DBCC CHECKDB.

The output message above tells us that there are 2 consistency errors in the database "my_database", and that they are logical inconsistencies. The error indicates that the page and row counts for that table are inaccurate; this type of error can be fixed by running the DBCC UPDATEUSAGE command.


Execute the DBCC UPDATEUSAGE(my_database) command in the context of your database, and you will see the following message indicating that the inaccuracies were fixed.

DBCC UPDATEUSAGE: Usage counts updated for table 'my_table' (index 'my_table', partition 1): RSVD pages (In-row Data): changed from … to … pages.
DBCC execution completed. If DBCC printed error messages, contact your system administrator.


After the page counts have been corrected your SQL Server will be back to normal. You can run DBCC CHECKDB after and it should not return any error messages.
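The fix-and-verify sequence described above, as a minimal sketch (my_database is the placeholder name used in this post):

```sql
-- Correct the page and row count metadata for the whole database
DBCC UPDATEUSAGE (my_database);

-- Re-run the integrity check; it should now report 0 consistency errors
DBCC CHECKDB (my_database);
```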


Querying Linked Server with OPENQUERY

If you try to invoke a user-defined function (UDF) through a linked server in SQL Server by using the four-part naming convention (SQLserver.SQLdatabase.dbo.SQLfunction), you may receive an error message.


Linked Server Error

The reason is that user-defined function calls inside a four-part linked server query are not supported in SQL Server. That is why the error message indicates that the syntax of a Transact-SQL statement is incorrect.

To work around this problem, use the OPENQUERY function instead of the four-part naming convention.


OPENQUERY executes the specified pass-through query on the specified linked server. This server is an OLE DB data source. OPENQUERY can be referenced in the FROM clause of a query as if it were a table name. OPENQUERY can also be referenced as the target table of an INSERT, UPDATE, or DELETE statement. This is subject to the capabilities of the OLE DB provider. Although the query may return multiple result sets, OPENQUERY returns only the first one.


OPENQUERY ( linked_server ,'query' )


linked_server: the name of the linked server.

query: the query string executed on the linked server. The maximum length of the string is 8 KB.


OPENQUERY does not accept variables for its arguments.
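Because of this, a parameterised pass-through query has to be built as a string and executed dynamically. A common workaround looks like the sketch below (Link_Server and the table are the illustrative names used later in this post):

```sql
-- OPENQUERY requires literal arguments, so build the full statement as a string
DECLARE @id INT = 101;
DECLARE @sql NVARCHAR(4000) =
    N'SELECT * FROM OPENQUERY (Link_Server, ''SELECT address FROM person.addresses WHERE id = '
    + CAST(@id AS NVARCHAR(10)) + N''')';
EXEC (@sql);
```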

OPENQUERY cannot be used to execute extended stored procedures on a linked server. However, an extended stored procedure can be executed on a linked server by using a four-part name
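For example (the linked server name is illustrative; xp_msver is a standard extended stored procedure):

```sql
-- Extended stored procedures can be called via four-part naming, unlike UDFs
EXEC Link_Server.master.dbo.xp_msver;
```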


Any user can execute OPENQUERY. The permissions that are used to connect to the remote server are obtained from the settings defined for the linked server.


    • pass-through UPDATE query against the linked server
UPDATE OPENQUERY (Link_Server, 'SELECT city FROM person.addresses WHERE id = 101') SET city = 'Melbourne';
    • pass-through INSERT query against the linked server
INSERT OPENQUERY (Link_Server, 'SELECT address FROM person.addresses') VALUES ('NewAddress');
    • pass-through DELETE query
DELETE OPENQUERY (Link_Server, 'SELECT address FROM person.addresses WHERE address = ''NewAddress''');
    • SELECT pass-through query
SELECT * FROM OPENQUERY (Link_Server, 'SELECT SQLdatabase.dbo.SQLfunction(functionParam1, functionParam2)');

SQL Server Reporting Services Data Alerts

SQL Server 2012 introduced data alerts as a way to notify users of changes to report data, enabling them to view reports only when changes in the data may have business implications. By using this data-driven alerting solution, you no longer have to seek out information; it comes to you.

Data alert messages are sent by email. Depending on the importance of the information, you can choose to send messages more or less frequently and only when results change. You can specify multiple email recipients, keeping others informed to enhance efficiency and collaboration.

A good example of a data alert is sending out monthly sales report data only if the total sales amount is less than a specified amount.


Data Alerts Key areas

  • To create data alerts on reports, the reports must be saved or uploaded to a SharePoint library. This means that you cannot create alerts on reports saved to a report server in native mode or on your computer. You also cannot create alerts on reports embedded in custom applications.
  • You can create data alerts on reports whose data sources are configured to use stored credentials or no credentials. You cannot create alerts on reports configured to use integrated security or to prompt for credentials: the report is run as part of processing the alert definition, and processing fails without credentials.
  • The SQL Server alerting database is automatically created when Reporting Services is installed in SharePoint mode. Data alert definitions and alerting metadata are saved in the alerting database. By default, this database is named ReportingServices<GUID>_Alerting.
  • When a data alert definition is saved, alerting creates a SQL Server Agent job for it. The job includes a job schedule based on the recurrence pattern defined in the alert definition. Running the job initiates processing of the data alert definition.
  • An alert has only one owner, the person who created it. The alert owner can view information about their data alerts and delete and edit their data alert definitions.
  • Alerting administrators, users with SharePoint Manage Alerts permission, can manage data alerts at the site level. They can view lists of alerts by each site user and delete alerts.
  • Reporting Services data alerts are different from SharePoint alerts. SharePoint alerts can be set on any document type, including reports, and are sent when the document definition changes. Data alerts are sent when the data shown in a report satisfies the rules in the alert definition.
  • Multiple alerts can exist on a report, and multiple users can create the same or different alerts on a report.
  • A report must contain at least one data region for an alert to be created on it.


Here is Microsoft diagram for data alert definition

data alerts process


Creating a data alert definition

  1. Locate the SharePoint library that contains the report that you want to create a data alert definition for.
  2. Run the report. Verify that the report shows the data that you want to receive alert messages about. NOTE: The parameter values you chose to run the report are saved in the alert definition and will be used when the report is rerun as a step in processing the alert definition. To use different parameter values, you must create a new alert definition.
  3. On the Actions menu, click New Data Alert. The Data Alert Designer opens, showing the first 100 rows of the first data feed that the report generates in a table.

actions to create new data alert

4. Select the data feed in the Report data name drop-down list.

  5. Click Add rule. In the list, select the column that you want to use in the rule, then select a comparison operator and enter the threshold value.
  6. Select an option in the recurrence list. Depending on the type of recurrence, enter an interval.
  7. Enter the email addresses of alert message recipients, separated by semicolons. If the email address of the person who created the alert definition is available, it is added to the Recipient(s) box.
  8. Click Save.


data alerts scheduling


Data Alerts Maintenance

Once set up, alerts can be maintained by right-clicking the report requiring alert maintenance and then selecting Manage Data Alerts.

The alert maintenance screen provides the following information: last run, status and sent alerts. An alert can be edited to change items such as the rule criteria, the dataset or the email recipients.



Microsoft’s SQL Server 2014 new features

HotLap Melbourne was a great opportunity to find out what's new in the next release of SQL Server, SQL Server 2014. It was nicely presented by Peter Ward. Two major topics were covered: Database Engine Improvements and Power BI.


Key notes summary

The summary below pairs each SQL Server 2014 benefit with how it is achieved and the features that deliver it.

Mission-Critical Performance (high reliability as well as speed)

Achieved through:
  • In-memory built in for OLTP, DW and in-memory analytics
  • Enhanced security and scalability
  • High availability using AlwaysOn
  • Live support for mission-critical solutions

Features:
  • Memory-optimised tables and indexes
  • Compiling T-SQL to DLLs
  • ColumnStore indexes, now clustered and updateable
  • Buffer pool extension using SSD drives
  • Managed lock priority for online operations such as index rebuilds or switching table partitions
  • Enhancements to AlwaysOn: readable secondary replicas (up to 8)
  • New security roles for non-sysadmin DBAs
  • I/O Resource Governor

Platform for Hybrid Cloud (reduced cost per compute)

Achieved through:
  • Hybrid cloud solutions
  • On-ramp to cloud
  • Complete and consistent platform

Features:
  • Cloud backup, manual or automatic, with point-in-time restore
  • Cloud DR: easy deployment and recovery
  • Cloud bursting: ease of migrating and running SQL Server in the cloud
  • Migration wizards to Windows Azure VMs

Faster Insight from Any Data (consume it from anywhere)

Achieved through:
  • Easy data access
  • Familiar tools
  • Complete BI platform

Features:
  • Ability to combine external, internal and unstructured data
  • Excel/Office 365 enhancements
  • BI models spanning cloud and on-premises


Power BI explained


SQL Server Power BI


SSRS Reports Caching

In SSRS there are two options for pre-loading SSRS Reports: caching and snapshots

  • A cached instance of a report is based on the intermediate format of the report. The report server generally caches one instance of a report based on the report name. However, if a report can contain different data based on query parameters, multiple versions of the report may be cached at any given time. The first user who runs the report with a unique combination of parameter values creates a cached report that contains data for those parameters. Subsequent users who request the report with the same parameter values get the cached copy.
  • The other option is running a report from a report snapshot. A snapshot is a report that contains layout information and data refreshed at a specific point in time. A report snapshot is usually created and refreshed on a schedule, allowing you to time exactly when report and data processing occur. A report snapshot is stored in intermediate form in the ReportServer database, from which it is retrieved when a user or subscription requests the report. When a report snapshot is updated, it is overwritten with a new instance. The report server does not save previous versions of a report snapshot unless you specifically set options to add them to report history.

Note: a copy of the processed cached report is saved into ReportServerTempDB after execution. A data source with stored credentials on the report server must be used in order to implement caching.


ReportServer database:
  • Stores information about the report schema, report properties, data sources, parameters, the folder hierarchy and the report execution log
  • Content persists until changes are made to the RDL schema
  • Helps to access the report structure
  • Data survives SSRS service restarts

ReportServerTempDB:
  • Stores only the cached copy of the report data
  • Expires based on expiry settings
  • Helps to improve the performance of report execution, as data is loaded from the cache
  • Service restarts clear the temp data; all cached instances are reinstated when the Report Server Web service comes back online

Refreshing the Cache

A cached report is replaced with a newer version when a user selects the report after the previously cached copy has expired. Reports that are configured to run as cached instances are removed from the cache at regular intervals based on expiration settings. You can set a report’s expiration in minutes or at a scheduled time, as determined by the data’s immediacy requirement. You cannot delete reports from the cache directly unless you use the SOAP API.

To configure cache expiration, you can use a shared schedule or report-specific schedule. If you use a shared schedule and it is subsequently paused, the cache does not expire while the schedule is inoperative. If the shared schedule is subsequently deleted, a copy of the schedule settings is saved as a report-specific schedule.

If a schedule expires or if the scheduling engine is unavailable at a cache expiration date, the report server runs a live report until scheduled operations can be resumed (by either extending the schedule or starting the scheduling service).

Preloading the Cache

To improve server performance, you can preload the cache. You can preload the cache with a collection of parameterised report instances in two ways:

  1. Create a cache refresh plan. When you create a refresh plan, you can specify a schedule for a single report or specify a shared schedule.
  2. Create a data-driven subscription that uses the Null Delivery Provider. When you specify the Null Delivery Provider as the method of delivery in the subscription, the report server targets the report server database as the delivery destination and uses a specialized rendering extension called the null rendering extension. In contrast with other delivery extensions, the Null Delivery Provider does not have delivery settings that you can configure through a subscription definition.

When you specify a schedule or when you create the data-driven subscription, you schedule how often the reports are delivered to the cache. In order for new copies to be delivered to the cache, the old copies must have expired. Therefore, the Execution properties of the report must be configured to include cache expiration settings. The expiration setting must be consistent with the subscription schedule that you define. 

For example, if you create a subscription that runs every night, the cache should also expire every night prior to the subscription’s run time. If the Execution properties do not include expiration times, newer deliveries are disregarded.


To schedule the expiration of a cached report

      1. Start Report Manager (SSRS Native Mode).
      2. In Report Manager, navigate to the Contents page. Navigate to the report for which you want to set caching properties, hover over the item, and click the drop-down arrow.
      3. In the drop-down menu, click Manage.
      4. In the left frame, click the Processing Options.
      5. On the page, select Always run this report with the most recent data.
      6. Select one of the following two cache options and configure expiration as follows:
        • To configure a cached copy to expire after a particular time period, click Cache a temporary copy of the report. Expire copy of report after a number of minutes, and type the number of minutes for report expiration.
        • To configure a cached copy to expire on a schedule, click Cache a temporary copy of the report. Expire copy of report on the following schedule, then click Configure or select a shared schedule to control report expiration (for example, a shared schedule that runs every morning).

        Processing Options: Always run this report with the most recent data

      7. Click Apply.
      8. To create a cache refresh plan, which creates a schedule for preloading the cache with temporary copies of data for a report, click Cache Refresh Options and then click New Cache Refresh Plan.
      9. Fill in the blanks and set the shared schedule, then click OK.

ssrs cache refresh options


Global site schedules: one to generate the cached report and another to delete the expired cached report

expired cache schedule

report cache schedule


Some useful T-SQL queries (SSRS caching)

Check whether execution logging is enabled:

USE [ReportServer]
SELECT * FROM ConfigurationInfo WHERE Name = 'EnableExecutionLogging'


Get the current retention period for the execution log:

USE [ReportServer]
SELECT * FROM ConfigurationInfo WHERE Name = 'ExecutionLogDaysKept'


Execution Log details, including 
  • InstanceName
  • ReportPath
  • UserName
  • ExecutionId 
  • RequestType: {Interactive, Subscription}
  • Format  rendering
  • Parameters 
  • ReportAction: { Render, BookmarkNavigation, DocumentMapNavigation, DrillThrough, FindString, GetDocumentMap, Toggle, Sort }
  • TimeStart and TimeEnd
  • TimeDataRetrieval, TimeProcessing, TimeRendering
  • Source: { Live, Cache, Snapshot, History, AdHoc, Session, Rdce }
  • Status
  • ByteCount
  • RowCount
  • AdditionalInfo: an XML property bag containing additional information about the execution

select * from ExecutionLog 

select * from ExecutionLog2

select * from ExecutionLog3 


  • ExecutionLog is kept for backward compatibility and is available from SQL Server 2008 onward.
  • ExecutionLog2 was introduced with SQL Server 2008.
  • ExecutionLog3 was introduced with SQL Server 2008 R2.

ExecutionLog2 and ExecutionLog3 are similar; the difference is in column names: ReportPath vs ItemPath and ReportAction vs ItemAction.
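A typical troubleshooting query against ExecutionLog3 uses the columns listed above to spot slow or non-cached executions (the column choice and ordering here are illustrative, not prescriptive):

```sql
USE [ReportServer]
-- Most recent executions first; Source shows Live vs Cache vs Snapshot
SELECT TOP 50 ItemPath, UserName, TimeStart,
       TimeDataRetrieval, TimeProcessing, TimeRendering,
       Source, Status, ByteCount, [RowCount]
FROM ExecutionLog3
ORDER BY TimeStart DESC;
```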

Cached report metadata:

SELECT C.Name, EC.*, SD.*
FROM ReportServerTempDB.dbo.ExecutionCache AS EC
INNER JOIN ReportServer.dbo.Catalog AS C ON EC.ReportID = C.ItemID
INNER JOIN ReportServerTempDB.dbo.SnapshotData AS SD ON SD.SnapshotDataID = EC.SnapshotDataID