MS SQL Server and Anti-Virus software

According to Suhas’ blog (a former Microsoft employee) http://blogs.msdn.com/b/suhde/archive/2009/04/08/introduction-to-database-corruption-in-sql-server.aspx: “antivirus, that have the ability to take Kernel Level Handle on files, are also known to cause database corruption issues. In case you have Antivirus Installed on the system, please ensure that you exclude the SQL Server Database Files from Antivirus scans”.

You can also refer to Guidelines for choosing antivirus software to run on the computers that are running SQL Server.

 

Antivirus activity can cause SQL Server performance degradation and data corruption. The following are some of the possible issues:

  • Antivirus activity on the SQL Server file system can cause high disk utilisation
  • It can cause high CPU spikes or sustained high CPU usage
  • In clustered environments, you may experience resource failures or problems when you try to move a group to a different node
  • Access can be denied to certain SQL Server files because they are locked during the scan/monitor operation
  • It may cause stop errors on Windows Server

 

So what do I do so that anti-virus software can live in sync with SQL Server?

When I install a new SQL Server in our Melbourne office, I always make sure to exclude all of the paths listed below from any type of scanning or real-time monitoring.

 

  1. Binaries. The paths to the actual executables for any of your running SQL Server services (MSSQL, SQL Server Agent, SSAS, etc.). These are found, by default, in the C:\Program Files\Microsoft SQL Server folder – though this could easily be a different path on many production machines. (Note: make sure that C:\Program Files (x86)\Microsoft SQL Server is included in any exclusions as well on x64 machines.)
  2. SQL Server error logs. These are not database log files, but the text files that SQL Server uses to keep its own ‘event log’ up to date.
  3. Data and log files. The actual .mdf, .ndf, and .ldf files – or the locations of data files and log files. Creation of new databases, file-growth operations, and other database operations can be blocked by anti-virus operations if you don’t exclude these files from anti-virus monitors.
  4. Backups. Anti-virus doesn’t need to monitor SQL Server backups, so exclude those path locations from monitoring as well.
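To work out exactly which folders hold data and log files on a particular instance, you can run a query like the minimal sketch below against sys.master_files (this only covers item 3; the error log and backup folders still need to be identified separately):

SELECT DISTINCT
    LEFT(physical_name, LEN(physical_name) - CHARINDEX('\', REVERSE(physical_name))) AS folder_path,  -- strip the file name, keep the folder
    type_desc                                                                                          -- ROWS (data) or LOG
FROM sys.master_files
ORDER BY folder_path;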

 

For the full list, refer to the anti-virus exclusions list (PDF) I compiled.

However, as with everything in the IT world, there is no single solution for every scenario. If your SQL Server is running on a web server that is accessible from the Internet, then anti-virus software still needs to be installed. Resource contention then needs to be overcome by getting bigger and better hardware, or by optimising the current hardware to the Nth degree.

And don’t forget Windows and SQL Server patching: applying critical security updates promptly may help you avoid security breaches, including viruses and malware.

SQL Enterprise Architect Summit – Melbourne

SQL Server Enterprise Architect Summit (SEAS)

The SQL Enterprise Architect Summit was held at the Microsoft Melbourne office on 27 and 28 February 2013. SEAS was a two-day, intensive 400-level event designed for experienced database architects and database administrators. Link to Microsoft site

I attended this event. The biggest word in the Microsoft community right now is “cloud” – cloud-hosted varieties, technologies and platforms. Microsoft presents itself as the only vendor with a full range of integrated services in the cloud. Here are my notes.

Event Objective:

Learning and leveraging the new features and functionality of SQL Server 2012 and SQL Azure for business-critical solutions. Discover best practices, real-world experience, and the design and deployment of mission-critical SQL Server enterprise solutions.

 

Microsoft Cloud Solution

Presenters:

Nicholas Dritsas and Emil Velinov – members of the SQL Server Customer Advisory Team (CAT) in Redmond. Nicholas is a Principal Program Manager in the SQL Server product group at Microsoft. Emil is a Senior Program Manager in the Windows Azure Customer Advisory Team (CAT) at Microsoft.

Event Topics:

1. How to scale Windows Azure with SQL Azure databases?

  • Windows Azure offers multiple services to manage enterprise data and applications in the cloud across a global network of Microsoft-managed datacenters
  • Subscription requirements
  • Microsoft provides a cloud architecture fit for purpose / performance / high availability (3 physical nodes with a balancer)
  • PaaS – platform as a service (includes SQL Server licences)
  • IaaS – infrastructure as a service (provision your own VHDs / VMs)
  • Windows Azure is ideal for applications needing scalability, availability and fault tolerance
  • Windows Azure candidates: dispersed users and data, elastic demand (moving from Capex -> Opex), quick scale-out
  • End of 2013: hybrid cloud. Data in house, secure VPN, supporting encryption
  • Windows Azure consists of: Azure Compute | Azure Storage | Azure SQL Databases

2. Best practices for installing and configuring SQL Server 2012 on Windows Azure IaaS

  • Cloud offerings: IaaS, PaaS, SaaS
  • Microsoft has offerings in all three areas
  • Windows Azure services: VM Role (stateless machines) and Virtual Machines. They consist of the infrastructure to deploy an application, a persistent OS disk and data disks, and internal/external networking
  • You define the Availability Set of your virtual machines – machines are placed on different nodes and racks for HA
  • Best practice is to make sure that your databases are indeed highly available, whether you are using AlwaysOn Availability Groups or mirroring
  • The general recommendation is to attach a separate data disk with no read or write caching and use it for database files. Attaching separate disks also enables you to customise disk capacity up to 1 terabyte
  • If you need to store database files on the OS disk, it is recommended that you disable write caching when you provision the VM. You cannot disable read caching on the OS disk

 

3. AlwaysOn HA/DR Design Patterns, Architectures and Best Practices

  • AlwaysOn ≠ Availability Group
  • AlwaysOn is an integrated HA and DR solution consisting of SQL Server Failover Cluster Instances (FCI) and Availability Groups
  • An Availability Group is the ability to fail over multiple databases as a unit, to ensure the availability of all databases in the application
  • Works with contained databases (logins are part of the database backup)
  • Read-only replicas can be used for backups, database maintenance and reporting
  • Advantages: multi-database failover, built-in compression and encryption, automatic page repair, application failover using a virtual name (a minimal T-SQL sketch follows below)
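To make this concrete, here is a minimal T-SQL sketch of creating an Availability Group for two databases. The server, endpoint and database names are hypothetical, and it assumes the Windows cluster, mirroring endpoints and initial database backups/restores on the secondary are already in place:

-- Run on the primary replica (all names are illustrative only)
CREATE AVAILABILITY GROUP [AG_Sales]
FOR DATABASE [SalesDB], [OrdersDB]
REPLICA ON
    N'SQLNODE1' WITH (ENDPOINT_URL = N'TCP://SQLNODE1.contoso.com:5022',
                      AVAILABILITY_MODE = SYNCHRONOUS_COMMIT,
                      FAILOVER_MODE = AUTOMATIC),
    N'SQLNODE2' WITH (ENDPOINT_URL = N'TCP://SQLNODE2.contoso.com:5022',
                      AVAILABILITY_MODE = SYNCHRONOUS_COMMIT,
                      FAILOVER_MODE = AUTOMATIC,
                      SECONDARY_ROLE (ALLOW_CONNECTIONS = READ_ONLY))
GO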

4. Learn from the Largest Azure Projects in the World

  • Largest sharded database – 20 TB; number of databases – 11,000; number of cores – 24,000; Microsoft application size – 1 exabyte; customer application size – 50 PB
  • Examples: Florida Election 2012, BING Games, Samsung Smart TV, MYOB, PotterMore.com, TVB
  • Common to all projects: application bursting (Azure can provide a dedicated cache); application servers and databases growing very fast, with a large number of connections and a large number of deployments; frequent updates of the application and/or software; a large number of synchronised connections to the Azure framework

5. Microsoft Big data

  • What is big data: high data velocity, data variety and data complexity
  • HDInsight is Microsoft’s 100% Apache-compatible Hadoop distribution, available on Windows Server or as a Windows Azure service
  • It answers new types of questions, e.g. What is the social sentiment on …? How do traffic / weather affect my business? etc.
  • Microsoft solution marketplace:
    • Data Management – managing enterprise relational and non-relational data, streaming and multimedia
    • Data Enrichment – discover, transform and govern data
    • Insight – real-time and predictive analysis
    • Tools
      • PDW – Parallel Data Warehouse, appliances from HP / Dell
      • Hadoop – distributed file system, processing done at the node through the MapReduce scheduling and execution component

6. Columnstore Index Implementation and Performance

  • ColumnStore (CS) – a technology for improving query performance in data warehouses
  • In-memory technology for the relational engine
  • No need for summary / fact tables, indexed views or cubes -> lower hardware cost
  • CS is a new index type (see the sketch after this list)
  • Query processing runs in a new mode – batches instead of rows
  • Accelerates targeted workloads with highly efficient algorithms
  • Min/max values stored in metadata
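As a quick illustration, here is a minimal sketch of creating a nonclustered columnstore index on a fact table (the table and column names are hypothetical):

CREATE NONCLUSTERED COLUMNSTORE INDEX IX_FactSales_ColumnStore
ON dbo.FactSales (OrderDateKey, ProductKey, StoreKey, SalesAmount, Quantity)
GO

Keep in mind that in SQL Server 2012 a table becomes read-only while a nonclustered columnstore index exists on it, so the index is typically dropped (or a partition switched) before new data is loaded.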

How to Show a friendly message when “No Data to Fetch” in Business Objects reports

Receiving “No data to retrieve in <query>” or “There is no data corresponding to this query…” pop-up boxes is very confusing for end users. In addition, your Web Intelligence (viewers and Java Report Panels) and Desktop Intelligence documents may not look as intended. They will typically show column headers without any content, zero totals, and some formulas may even display awkward error messages.

Business Objects "No data to fetch"

Causes for the “No data to retrieve in” message

There could be many different factors; these are the most common issues:

  1. Parameters and/or report conditions are too restrictive or negate each other.
  2. The data is missing from the table the report queries.
  3. Row-level restrictions apply to the current user.
  4. Incorrect universe connections.

However, there are times when you may expect that a report will not return any data for some or all of its queries. So how can we suppress this pop-up window?

Suppressing the “No data to retrieve” message

To suppress this message, some Business Objects forum threads propose using a VB macro or some other SDK integration. Yet there are simpler ways to do it.

1. Use conditions on the report objects. Create a variable that will hold the number of rows returned by the data provider, e.g. a variable “RowCount” with the formula =NumberOfRows(DataProvider(<MyQuery1>))

The next step is to assign a condition to your table/chart so that it is visible when the data source returns some data and hidden otherwise. Create a “No data” block to display when the data source contains no data. Position both objects on top of each other.

2. Create a union in your data source query such that one of the queries always returns a value. I usually use my dummy empty objects for the second query. The only catch is that the number of objects and the object types in the “Result Objects” pane must be the same in both queries. To keep your report clean you may need to add a report-level filter. Note: dummy objects must reference a dummy table or view in order to parse in Designer. A rough SQL illustration of the idea follows.
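To make the union idea concrete, here is a rough SQL illustration of what the combined query boils down to; the actual SQL is generated by the universe, and the table names here are hypothetical:

SELECT s.region, s.sales_amount
FROM dbo.Sales s
WHERE s.sale_date >= '2013-01-01'
UNION ALL
SELECT 'No data' AS region, 0 AS sales_amount
FROM dbo.DummyRow    -- a one-row dummy table, so the combined query always returns at least one row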

How to move SQL Server user databases?

Moving user database files to a new location is a very common task for any SQL Server DBA.

Planned relocation

Since SQL Server 2005 there have been three ways to accomplish this task.

    1. Take the database offline and move the data, log, and full-text catalog files of the user database to a new location, specifying the new file location in the FILENAME clause of the ALTER DATABASE statement. This is used to move database files within the same SQL Server instance.
    2. Back up and restore the database in the new location, with or without a move. The RESTORE … WITH MOVE option allows you to restore your database while also specifying the new location for the database files (mdf and ldf files).
    3. Detach and attach the database. This is used to move a database to another instance of SQL Server or to another server. There are some restrictions: the operation is not supported for
      • System databases
      • Mirrored databases
      • Databases involved in replication
      • Databases that have a snapshot

Note: the sp_attach_db procedure is deprecated as of SQL Server 2008. The recommended way is to use CREATE DATABASE database_name FOR ATTACH instead. Both methods 2 and 3 are sketched below.
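For illustration, here are minimal sketches of methods 2 and 3; the database name, backup share and file paths are hypothetical, and the logical file names are assumed to be MyDB and MyDB_log:

-- Method 2: back up on the source, restore on the target with new file locations
BACKUP DATABASE MyDB TO DISK = N'\\BackupShare\MyDB.bak' WITH INIT
RESTORE DATABASE MyDB
FROM DISK = N'\\BackupShare\MyDB.bak'
WITH MOVE N'MyDB' TO N'D:\MSSQL\Data\MyDB.mdf',
     MOVE N'MyDB_log' TO N'D:\MSSQL\Log\MyDB_log.ldf',
     RECOVERY

-- Method 3: detach on the source, copy the .mdf/.ldf files, then attach on the target
EXEC sp_detach_db @dbname = N'MyDB'
-- (copy the files to the new server here)
CREATE DATABASE MyDB
ON (FILENAME = N'D:\MSSQL\Data\MyDB.mdf'),
   (FILENAME = N'D:\MSSQL\Log\MyDB_log.ldf')
FOR ATTACH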

I found a lovely script that helps migrate all user databases from one server location to a new one: http://trentgardner.net/sql/move-sql-server-database-files.

Note: Methods 2 and 3 are usually used to move user databases to another server. You must also transfer all database metadata, including logins, jobs, etc., to the new server.

 

Moving Database files within a server

Why would you do this? The most common problem is running out of space on the system drive, for example when the database was installed together with an application. Last week I received an error message from the UAT team:

 

database log is full

After investigating, I noticed that the database data and log files had been installed on the C:\ drive. So the task was to move both the data and log files to another drive with plenty of space.

Plan of action

 

  1. Obtain the logical file names by querying the name column in the sys.master_files catalog view.
SELECT name, physical_name AS CurrentLocation, state_desc
FROM sys.master_files
WHERE database_id = DB_ID(N'shifttrack');

Database files location

2. Run the following statement.

ALTER DATABASE shifttrack SET OFFLINE

3. Move data and log files to the new location. 

4. Run the following statement.

ALTER DATABASE shifttrack MODIFY FILE ( NAME = shifttrack, FILENAME = 'D:\MSSQL\Data\shifttrack.mdf' )
ALTER DATABASE shifttrack MODIFY FILE ( NAME = shifttrack_log, FILENAME = 'D:\MSSQL\Data\shifttrack_log.ldf' )

Database files path changed

5. Run the following statement.

ALTER DATABASE shifttrack SET ONLINE

6. Verify the file change by running the following query.

SELECT name, physical_name AS CurrentLocation, state_desc
FROM sys.master_files
WHERE database_id = DB_ID(N'shifttrack');

 New file location verified

Some useful queries:

Check connection to the DB

USE master
SELECT * FROM sys.sysprocesses WHERE dbid = DB_ID('MyDB')

 To force disconnections

USE master
ALTER DATABASE MyDB SET SINGLE_USER WITH ROLLBACK IMMEDIATE
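Once the files have been moved and the database is back online, remember to return it to multi-user mode (a quick sketch):

USE master
ALTER DATABASE MyDB SET MULTI_USER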

 

Reallocation due to failure

The sequence below is the one suggested by Microsoft:

  1. Stop the instance of SQL Server if it is started.
  2. Start the instance of SQL Server in master-only recovery mode by entering one of the following commands at the command prompt.

For the default (MSSQLSERVER) instance, run the following command.

    NET START MSSQLSERVER /f /T3608

For a named instance, run the following command.

    NET START MSSQL$instancename /f /T3608

3. For each file to be moved, use sqlcmd commands or SQL Server Management Studio to run the following statement.

ALTER DATABASE database_name MODIFY FILE( NAME = logical_name , FILENAME = 'new_path\os_file_name' )

4. Exit the sqlcmd utility or SQL Server Management Studio.

5. Stop the instance of SQL Server.
6. Move the file or files to the new location.
7. Start the instance of SQL Server. Run: NET START MSSQLSERVER

8. Verify the file change by running the following query.

SELECT name, physical_name AS CurrentLocation, state_desc
FROM sys.master_files
WHERE database_id = DB_ID(N'<database_name>');

Update Statistics, Rebuilding Indexes, Reorganising Indexes

There are lots of discussions and recommendations on the net around this topic in the SQL Server community. What is the difference between index rebuilding and reorganising? When do you choose one over the other? What does updating stats do to your database? Do you need to update statistics after rebuilding indexes?

I’m pleased to share the findings of my extensive research into this very important topic of SQL Server database administration. Hopefully, the recommendations below will be useful to you. They have all been trialed and tested – I’ve played with my databases a lot over the years.

Update statistics facts

  • Update stats updates query optimisation statistics on a table or indexed view
  • By default, the query optimiser updates statistics as necessary to improve the query plan; however, sometimes additional statistics need to be created for best results
  • Updating statistics ensures that queries compile with up-to-date statistics
  • Note that updating statistics causes queries to recompile
  • Take into account the performance trade-off between improving query plans and the time it takes to recompile queries
  • Update stats can use tempdb to sort the sample of rows for building statistics
  • Make sure the AUTO_UPDATE_STATISTICS setting on the database is ON so that the query optimiser continues to routinely update statistics

When to run Update Statistics

  • Query execution times are slow.
  • Insert operations occur on ascending or descending key columns
  • After maintenance operations that change the distribution of data, such as truncating a table or performing a bulk insert of a large percentage of the rows

When not to run Update Statistics

Generally I follow the principles outlined in the Technet article “Using Statistics to Improve Query Performance”.

  • Statistics do not need to be updated after operations such as rebuilding, defragmenting, or reorganising an index. Why? Because the distribution of data does not change during those operations
  • In addition, when an index on a table or view is rebuilt with ALTER INDEX REBUILD or DBCC DBREINDEX, the query optimizer updates statistics. This occurs as a byproduct of the internal implementation of these operations, not because the statistics need to be updated
  • The query optimiser does not update statistics after defragmentation or reorganisation operations because these operations do not drop and recreate the index

How it’s done in T-SQL?
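A minimal sketch, using a hypothetical table name: UPDATE STATISTICS targets a single table or indexed view, while sp_updatestats sweeps every user table in the current database and only touches statistics whose underlying rows have been modified.

UPDATE STATISTICS dbo.Sales WITH FULLSCAN   -- one table, sampling every row
GO
EXEC sp_updatestats                         -- all user tables in the current database
GO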

Index Fragmentation

Modifications to the database can cause index fragmentation when indexes have pages in which the logical ordering, based on the key value, does not match the physical ordering inside the data file. Heavily fragmented indexes can degrade query performance and cause your application to respond slowly. What are the options?

According to Paul Randal, if an index has fewer than 1,000 pages and is in memory, don’t bother removing fragmentation: http://www.sqlskills.com/blogs/paul/post/where-do-the-books-online-index-fragmentation-thresholds-come-from.aspx. The other options are rebuilding indexes and reorganising indexes.

Rebuilding Indexes

  • Rebuilding an index drops and re-creates the index
  • It removes fragmentation
  • It compacts pages and reclaims disk space
  • Fill factor can be set during the operation
  • It reorders the index rows in contiguous pages
  • Rebuilding or reorganizing small indexes often does not reduce fragmentation as pages of small indexes are stored on mixed extents

Reorganising Indexes

  • Index reorganisation defragments the leaf level of clustered and nonclustered indexes on tables and views by physically reordering the leaf-level pages to match the logical order of the leaf nodes
  • Reorganising compacts the index pages
  • Reorganising an index uses minimal system resources

How to rebuild / reorganise indexes?

  1. Run the following T-SQL to obtain size and fragmentation information for the data and indexes
SELECT object_id AS objectid, index_id AS indexid, partition_number AS partitionnum, avg_fragmentation_in_percent AS frag
FROM sys.dm_db_index_physical_stats (DB_ID(), NULL, NULL , NULL, 'LIMITED')

2. Check avg_fragmentation_in_percent value against Microsoft Best practices.

If the avg_fragmentation_in_percent value is > 5% and <= 30%, run the ALTER INDEX statement with the REORGANIZE clause.

If the avg_fragmentation_in_percent value is > 30%, run the ALTER INDEX statement with the REBUILD clause.
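For a single index the statements look like the sketch below; the index and table names are hypothetical, and ONLINE = ON requires Enterprise edition:

ALTER INDEX IX_Sales_OrderDate ON dbo.Sales REORGANIZE                  -- fragmentation > 5% and <= 30%
ALTER INDEX IX_Sales_OrderDate ON dbo.Sales REBUILD WITH (ONLINE = ON)  -- fragmentation > 30%
GO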

In my opinion, the Index and Statistics Maintenance script from Ola’s solution is the best option for busy DBAs. I use the scripts on the majority of my databases, with execution levels based on best-practice guidance.

To rebuild or reorganise indexes with fragmentation and update modified statistics on all user databases, run the following:

EXECUTE dbo.IndexOptimize @Databases = 'USER_DATABASES',
@FragmentationLow = NULL,
@FragmentationMedium = 'INDEX_REORGANIZE,INDEX_REBUILD_ONLINE,INDEX_REBUILD_OFFLINE',
@FragmentationHigh = 'INDEX_REBUILD_ONLINE,INDEX_REBUILD_OFFLINE',
@FragmentationLevel1 = 5,
@FragmentationLevel2 = 30,
@PageCountLevel = 1000,
@UpdateStatistics = 'ALL',
@OnlyModifiedStatistics = 'Y'

SQL Server DBCC Commands

Database Consistency Checker (DBCC) commands are used to get valuable insight into what’s going on inside the SQL Server system, including checking the logical and physical consistency of database structures and fixing problems.

Some facts about DBCC commands

  • There are documented and undocumented DBCC commands. Use of undocumented commands is discouraged for SQL Server non-experts.
  • DBCC commands vary from SQL Server release to release
  • DBCC commands take input parameters and return values.
  • All DBCC command parameters can accept both Unicode and DBCS literals

What DBCC can be used for

  • Checking the whole database consistency
  • Checking the integrity of database tables and their related indexes
  • Examining the integrity of database pages
  • Inspecting the allocation of individual pages to a database, tables and indexes
  • Rebuilding the indexes on any given table as they become corrupted, unoptimised or just out of date
  • Clearing database memory buffer and cache for testing
  • Truncating SQL Server Error Log
  • Checking current query optimisation statistics for a table or indexed view
  • Shrinking the size of the specified database, data or log file for the current database
  • Inspecting transaction log space usage statistics for all databases

DBCC Categories

DBCC commands are grouped into four major categories.

  1. Maintenance commands. Used for maintenance tasks on a SQL Server database, index, or filegroup, e.g. DBCC DBREINDEX, DBCC DBREPAIR, etc.
  2. Miscellaneous commands. Mainly used for enabling trace flags, row-level locking, or removing a DLL from memory, e.g. DBCC ROWLOCK, DBCC TRACEOFF, etc.
  3. Informational commands. Used for retrieving information about the database, connections, transactions and other objects, e.g. DBCC OPENTRAN, DBCC SHOWCONTIG, etc.
  4. Validation commands. Used to perform validation operations on databases, tables, other database objects, or the allocation of database pages, e.g. DBCC CHECKALLOC, DBCC CHECKCATALOG, etc. These check commands operate on an internal read-only database snapshot that the Database Engine creates, which prevents blocking and concurrency problems while they run. A few examples follow below.
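For reference, here is a minimal sketch of a few documented DBCC commands from the categories above; the database, table and index names are hypothetical:

DBCC CHECKDB (N'MyDB') WITH NO_INFOMSGS              -- validate logical and physical consistency
DBCC OPENTRAN (N'MyDB')                              -- show the oldest open transaction
DBCC SQLPERF (LOGSPACE)                              -- transaction log space usage for all databases
DBCC SHOW_STATISTICS ('dbo.Sales', 'IX_Sales_Date')  -- current optimiser statistics for one index
GO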

To learn about all the DBCC commands, run the following script in SSMS.

DBCC TRACEON (2520)
DBCC HELP ('?')
GO

Business Objects Enterprise XI R3.1

What is BusinessObjects Enterprise?

BusinessObjects Enterprise is a flexible, scalable, and reliable solution for delivering powerful, interactive reports to end users via any web application – intranet, extranet, internet or corporate portal. It’s an integrated suite for reporting, analysis, and information delivery.

BO Enterprise provides a solution for increasing end-user productivity and reducing administrative efforts.

What is BusinessObjects Enterprise XI?

BusinessObjects Enterprise XI is the business intelligence platform based on industry-standard, proven architecture and platform support for semantic layers, data integration, and security.

It supports the entire range of performance management, reporting, querying, and analysis applications. It also provides full web-based administration and configuration of the entire system.

The architecture is designed to meet the needs of virtually any BI deployment, and is flexible enough to grow from several users with a single tool, to tens of thousands of users with multiple tools and interfaces.

BusinessObjects Enterprise components can be installed on one or many machines to provide flexibility, reliability, and scalability.

 

BO Enterprise Infrastructure

The Enterprise Infrastructure is a series of services that are designed to communicate via CORBA (Common Object Request Broker Architecture), which runs over TCP/IP.

The Enterprise Infrastructure provides the framework for establishing connections between clients and servers.

 

Business Objects Enterprise Architecture

 

The BusinessObjects Enterprise Infrastructure:

  • is the centerpiece of BusinessObjects Enterprise technology
  • facilitates communication between servers
  • provides the means for client objects to make requests to server objects, and for server objects to serve requests back to client objects

BusinessObjects Enterprise components

  1. BusinessObjects Enterprise web services
  2. BusinessObjects Enterprise management services
  3. BusinessObjects Enterprise storage services
  4. BusinessObjects Enterprise processing services
  5. Server Intelligence
  6. Life-Cycle Management
  7. Client Applications

Reporting Services migration and upgrade

There are two distinct options in updating SQL Server environment: upgrade and migrate.

  • Upgrade. This involves an actual upgrade of the technology on the servers and instances where it is currently installed; it is sometimes called an “in place” upgrade. To upgrade, you run the SQL Server Installation Wizard, which upgrades the report server program files, database, and all application data. You need to check what the prerequisites are and which versions and editions can be upgraded. For SQL Server 2012 click here
  • Migrate. This involves more steps than an upgrade, as you need to install a new environment, copy your metadata and data to the new environment, and finally configure the new environment to use the existing content. Before committing to a migration, review the requirements to determine whether your hardware and software can support the edition and version of SQL Server Reporting Services (SSRS)

Reporting Services migration checklist

  • Back up the encryption key. Use the Reporting Services Configuration tool to back up the key
  • Back up the report server database
  • Back up the following report server configuration files:
    • Rsreportserver.config
    • Rswebapplication.config
    • Rssvrpolicy.config
    • Rsmgrpolicy.config (Native mode only)
    • Reportingservicesservice.exe.config (Native mode only).
    • Web.config for the report server ASP.NET application.
    • Web.config for the Report Manager ASP.NET application (Native mode only).
    • Machine.config (for ASP.NET if you modified it for report server operations).
  • Move the report server database and other application files from your existing installation to a new SQL Server installation (use the “Install but do not configure” option).
The easiest way to move the report server databases is to detach and attach them. However, this approach requires that you take the report server offline while you detach the database. Backup and restore is a better choice if you want to minimise service disruptions; use T-SQL scripts to back up and restore the report server database and the report server temp database (a short backup sketch follows after this checklist).
  • Configure the report server by following a few easy steps
    1. Start the Reporting Services Configuration Manager and open a connection to the report server
    2. On the Database page, click Change Database. Click Next
    3. Click Choose an existing report server database. Click Next
    4. Select the SQL Server that now hosts the report server database and click Test Connection. Click Next
    5. In Database Name, select the report server database that you want to use. Click Next
    6. In Credentials, specify the credentials that the report server will use to connect to the report server database. Click Next
    7. Click Next and then Finish
  • Restore the encryption keys. This step is necessary for enabling reversible encryption on pre-existing connection strings and credentials that are already in the report server database.
  • Edit RSReportServer.config to include any custom settings from the previous installation
  • Optionally, configure custom Access Control Lists (ACLs) for the new Reporting Services Windows service group.
  • Test your installation
  • Remove unused applications and tools after you have confirmed that the new instance is fully operational
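As a side note to the “move the report server database” step above, here is a minimal backup sketch; the default database names are assumed and the backup path is hypothetical:

BACKUP DATABASE ReportServer TO DISK = N'D:\Backups\ReportServer.bak' WITH INIT
BACKUP DATABASE ReportServerTempDB TO DISK = N'D:\Backups\ReportServerTempDB.bak' WITH INIT
GO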

Note: A Reporting Services installation requires that the SQL Server Database Engine instance include the RSExecRole role. Role creation, login registration, and role assignments occur when you set the report server database connection through the Reporting Services Configuration tool.

 

Reporting services Migration tools

  1. Reporting Services Migration Tool. This is an official Microsoft tool. It provides a UI and a command-line utility to migrate reports and other artifacts from one report server to another. It can also be used as a backup and restore tool for Reporting Services. The source and target servers must be SQL Server Reporting Services 2008 R2 or later. Unfortunately, the target server must be in SharePoint integrated mode only at the time of writing.
  2. Reporting Services Scripter. This is a .NET Windows Forms application. It helps in scripting and transferring all Microsoft SQL Server Reporting Services catalog items. It can also be used to easily move items en masse from one Reporting Services folder to another on the same server. Reporting Services Scripter can also transfer all catalog item properties such as descriptions, history options, execution options (including report-specific and shared schedules), subscriptions (normal and data-driven) and server-side report parameters. Available only for SQL 2000, SQL 2005 and SQL 2008.
  3. Reportsync. This tool can be used to sync reports between two SSRS servers. Other uses include downloading RDLs from SSRS to a local PC, uploading files to an SSRS server, and automatically attaching data sources on upload.

SQL Server 2008 R2 Tools

SQL Server 2008 R2 comprises a number of new features and tools that can be used to develop and manage databases and database solutions.

SQL Server Management Studio

SQL Server Management Studio (SSMS) first launched with Microsoft SQL Server 2005. It is used as an integrated environment for accessing, configuring, managing, and administering all components within SQL Server. Management Studio includes a number of graphical tools and rich script editors.

Business Intelligence Development Studio

Business Intelligence Development Studio (BIDS) is an integrated environment for developing business intelligence solutions, including Analysis Services cubes, data sources, Reporting Services reports, and Integration Services packages.

It is based on the Microsoft Visual Studio development environment, with customised SQL Server solution and project types. One solution can include projects of different types, independent of a specific server.

Reporting Services Components and Tools

SQL Server 2008 Reporting Services (SSRS) provides report server components for data and report processing and report delivery. It also includes Report Designer, Report Builder, Model Designer, Report Manager and the Reporting Services Configuration tool used to configure a Reporting Services installation.

Integration Services Tools and Utilities

SQL Server 2008 Integration Services (SSIS) provides complete SSIS package creation and management tools, such as SSIS Designer for building data transformation packages that include control flows, data flows, and event-driven logic; Query Builder; Expression Builder; and several command prompt utilities for managing and running packages.

SQL Server Configuration Manager

SQL Server Configuration Manager is a tool to manage the services associated with SQL Server, to configure the network protocols used by SQL Server, and to manage the network connectivity configuration from SQL Server client computers.

SQL Server Profiler

SQL Server Profiler is a tool that captures SQL Server events from an instance of the Database Engine. The events can be saved in a trace file or in a database table for later analysis and troubleshooting.

Database Engine Tuning Advisor

SQL Server Database Engine Tuning Advisor assists database administrators in creating effective indexes, indexed views, and partitions.

Command Prompt Utilities

The SQL Server Database Engine offers additional tools that can be run from the command prompt. The most popular are bcp, sqlcmd and tablediff. The bcp (bulk copy program) utility addresses the bulk movement of data into and out of a SQL Server database. sqlcmd enables database administrators to interact with SQL Server from the command line. The tablediff command-line utility provides the ability to compare the content of two tables in a database.