
Redshift Query performance tuning

A few tips for tuning query performance in Amazon Redshift:

Provisioned (clustered) vs. Serverless Redshift:
  • In a provisioned cluster, the allocated resources are predefined, which can become a bottleneck when running heavy workloads. Redshift Serverless, by contrast, scales up dynamically with the workload, which can improve efficiency and save money.
Order of joins in a SQL query:
  • Try to join the tables with less data first and then move on to the tables with a lot of data, so the intermediate result sets stay small.
CTE vs. temp table:
  • Both CTEs and temp tables are useful, but prefer CTEs when you are dealing with less data. Use temp tables when you are dealing with a lot of data, since the intermediate result is materialized once and can be reused (a sketch follows below).
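A minimal sketch of the CTE vs. temp table trade-off in Redshift (the schema, table, and column names here are illustrative assumptions, not from a real workload):

-- CTE version: the intermediate result is inlined into the query; fine for smaller data
with recent_orders as (
    select customer_id, sum(amount) as total
    from sales.orders
    where order_date >= '2023-01-01'
    group by customer_id
)
select c.customer_name, r.total
from sales.customers c
join recent_orders r on r.customer_id = c.customer_id;

-- Temp table version: the intermediate result is materialized once;
-- better when it is large or reused by several queries
create temp table recent_orders as
select customer_id, sum(amount) as total
from sales.orders
where order_date >= '2023-01-01'
group by customer_id;

select c.customer_name, r.total
from sales.customers c
join recent_orders r on r.customer_id = c.customer_id;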

Get Started with Azure Cosmos DB - SQL API

Azure Cosmos DB

Introduction

Azure Cosmos DB is a document-oriented, non-relational database. It does not model data in a tabular format; instead, the data resides in documents in JSON format. So, unlike SQL Server, the data in Cosmos DB does not live in a table-column structure.
It extends Azure DocumentDB, which was released in 2014. It is schema-less and generally classified as a NoSQL database. Azure Cosmos DB is Microsoft’s proprietary globally-distributed, multi-model database service "for managing data at the planet-scale", launched in May 2017. It uses a query language similar to SQL and supports ACID transactions.
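With the SQL API, you query the JSON documents using a SQL-like syntax. A minimal sketch against a hypothetical container of customer documents (the alias c refers to the container; the property names are assumptions):

SELECT c.id, c.customerName, c.address.city
FROM c
WHERE c.address.city = "Seattle"
ORDER BY c.customerName

Note that nested JSON properties (c.address.city) can be filtered directly with dot notation, something the table-column model of SQL Server cannot express.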

Create a view in AWS Athena by merging two SQL queries using Union All

Today we will learn how to create a view in AWS Athena by merging the results of multiple queries into one.

To merge multiple queries in Athena, we will create a view by using:
  • A common table expression (CTE) to hold the result of each query in a separate named result set
  • UNION ALL to merge the results of the queries
    • To use the UNION keyword, both queries must return exactly the same columns (same names, order, and types)
A sketch of the resulting view follows.
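A minimal sketch of the view (orders_2022 and orders_2023 are hypothetical source tables; replace them with your own):

CREATE OR REPLACE VIEW all_orders AS
WITH q1 AS (
    SELECT order_id, customer_id, amount FROM orders_2022
),
q2 AS (
    SELECT order_id, customer_id, amount FROM orders_2023
)
-- both branches return exactly the same columns, so UNION ALL is legal
SELECT * FROM q1
UNION ALL
SELECT * FROM q2;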

Power BI - Change your data source credentials (settings)

Scenario: You want to connect a Power BI dashboard's data source with a different account's credentials and refresh it.

Steps:
  • Within Power BI Desktop, go to File > Options and settings > Data source settings
  • Right-click your data source > Edit Permissions
  • Go to Database > supply the new username and password
  • Click Save
  • Refresh the data source
  • You are done.





Power BI - Could not load file or assembly System.EnterpriseServices

Scenario: When I open a Power BI dashboard (.pbix) file, I get the error "Could not load file or assembly 'System.EnterpriseServices', Version=4.0.0.0".

Solution: Start Power BI Desktop by running it as administrator. You will now either get past the error and your dashboard will open, or you will see the actual underlying error (most commonly a database login failure).


SQL: Violation of primary key constraint found duplicate key value

Scenario: We are inserting a value into a table that has a primary key constraint enabled. While inserting, we get the error: "Violation of PRIMARY KEY constraint 'xyz'. Cannot insert duplicate key in object 'dbo.customertbl'. The duplicate key value is (4)."


Resolution: The error means that the value you are inserting into the primary key column already exists there. Query the table and search for the value listed in the error ("The duplicate key value is (4)").

You have two options:
  • Delete the existing value from the table (not recommended)
  • Check why the duplicate value is coming from the source query that inserts into this table.
    • Most likely you will either have to alter your primary key to make it a composite key (multiple columns), or the data-fetching query is not functioning properly and needs to be fixed. The sketch below shows how to investigate.
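A minimal sketch of the investigation, reusing the table and key value from the error message above (the key column name id and the staging table are assumptions):

-- find the row that already holds the conflicting key
select * from dbo.customertbl where id = 4;

-- find keys that the source produces more than once
select id, count(*) as cnt
from dbo.customertbl_staging   -- hypothetical source of the insert
group by id
having count(*) > 1;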

Azure Data Factory SqlErrorNumber 8672: A MERGE statement cannot UPDATE/DELETE the same row of the target table multiple times

Scenario: We have a copy activity in a Data Factory pipeline that copies data from Azure Blob Storage (ADLS) to Azure SQL. In the sink, we use a SQL stored procedure to upsert the data with a MERGE statement. We pass the stored procedure a table type that holds the data from the source.

While running the pipeline, it fails giving the below error message:
  • Error: The MERGE statement attempted to UPDATE or DELETE the same row more than once.
  • { "errorCode": "2200", "message": "ErrorCode=SqlOperationFailed,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=A database operation failed with the following error: 'The MERGE statement attempted to UPDATE or DELETE the same row more than once. This happens when a target row matches more than one source row. A MERGE statement cannot UPDATE/DELETE the same row of the target table multiple times. Refine the ON clause to ensure a target row matches at most one source row, or use the GROUP BY clause to group the source rows.',Source=,''Type=System.Data.SqlClient.SqlException,Message=The MERGE statement attempted to UPDATE or DELETE the same row more than once. This happens when a target row matches more than one source row. A MERGE statement cannot UPDATE/DELETE the same row of the target table multiple times. Refine the ON clause to ensure a target row matches at most one source row, or use the GROUP BY clause to group the source rows.,Source=.Net SqlClient Data Provider,SqlErrorNumber=8672,Class=16,ErrorCode=-2146232060,State=1,Errors=[{Class=16,Number=8672,State=1,Message=The MERGE statement attempted to UPDATE or DELETE the same row more than once. This happens when a target row matches more than one source row. A MERGE statement cannot UPDATE/DELETE the same row of the target table multiple times. Refine the ON clause to ensure a target row matches at most one source row, or use the GROUP BY clause to group the source rows.,},],'", "failureType": "UserError", "target": "Copy data to Azure SQL", "details": [] }
Solution: On analysis, we found that our SQL table had one NOT NULL column that we were no longer using. After deleting this column from the SQL table, our pipeline ran successfully.
  • If we had required this column, we would have checked whether the source was actually sending data for it. If the data was not consistent (some nulls), we would have altered the table schema to make the column allow NULLs.
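If the root cause is instead genuine duplicates in the source, a minimal sketch of a MERGE that collapses them first, per the error message's own suggestion (the table, the table type parameter @src, and the columns are assumptions):

-- inside the upsert stored procedure; @src is the table type parameter
merge dbo.customertbl as tgt
using (
    -- group the source rows so each target row matches at most one source row
    select id, max(name) as name
    from @src
    group by id
) as s
on tgt.id = s.id
when matched then
    update set tgt.name = s.name
when not matched then
    insert (id, name) values (s.id, s.name);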

Azure data factory - datetime error in copy activity when loading data in SQL database

Today we will learn how to deal with loading datetime fields into a database using Data Factory.
Scenario: Suppose you have a datetime field in the source that you are trying to load into a database through Data Factory v2. You sometimes get an error that the datetime was converted to datetime2, or the copy activity fails to load the datetime column into the database. Follow the steps below:

Steps:

  • In the copy activity, change the mapping of the source column to string.
  • Use the stored procedure approach to load the data into the SQL database (see the sketch after this list)
  • In the Sink section of the copy activity, specify the stored procedure name, the table type, and the table type parameter
Alternate solution:
  • Try changing the mapping of the source column to Datetime
    • Specify the date format "yyyy-MM-dd"
  • Run the pipeline
  • It will convert the date into the specified format
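A minimal sketch of the stored procedure approach: the copy activity maps the datetime as a string, and the procedure converts it on insert (the type, procedure, table, and column names are assumptions):

-- table type matching the columns mapped in the copy activity
create type dbo.StageRowType as table (id int, event_date varchar(30));
go

create procedure dbo.usp_LoadStage
    @rows dbo.StageRowType readonly
as
begin
    insert into dbo.targettbl (id, event_date)
    select id,
           try_convert(datetime, event_date)  -- yields NULL instead of failing on a bad value
    from @rows;
end
go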



Passing dynamic parameters in SQL using dynamic Query


Today we will learn how to build a dynamic SQL query by passing dynamic parameters.


Scenario: Suppose you want to fetch data from multiple servers and databases using one common SQL statement (changing the database and server name dynamically in one query). To achieve this, we have to use a dynamic query in SQL.

Steps:
  • Create two variables:
    • @Servername
    • @Databasename
  • Declare a third variable (@SQL) as nvarchar(max) that will hold the SQL query.
  • Use single quotes and the '+' sign to splice the variables into the query string
  • To test the query, use "select @SQL".
    • It will show you the final SQL statement formed after the dynamic parameters are substituted
  • Once you are satisfied with the resulting SQL, change "select @SQL" to "execute (@SQL)". The full pattern is sketched below.
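A minimal sketch of the whole pattern (the linked server, database, and table names are illustrative assumptions):

declare @Servername   nvarchar(128) = N'MyLinkedServer';  -- hypothetical server
declare @Databasename nvarchar(128) = N'SalesDb';         -- hypothetical database
declare @SQL          nvarchar(max);

-- splice the variables into the query with single quotes and '+'
set @SQL = N'select top 10 * from [' + @Servername + N'].[' + @Databasename + N'].dbo.customertbl';

select @SQL;       -- inspect the final statement first
--execute (@SQL);  -- switch to this once the statement looks right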


Azure SQL - Script to create logins and give appropriate permissions to the user / login

Today we will learn how to create a script that creates logins and gives the appropriate permissions within Azure SQL.
Run the below script in your Azure SQL server:

--create a login. supply the login account name
-- the login can also be a user's Azure AD email that you want to add to SQL to grant permissions
use [master]
create login [<login account>] from external provider
go

----add the login to the database.
use [<Database name>]
go
create user [<login account>] from login [<login account>] with default_schema = [DBO];
go

---Create a db_executor role within the database
use [<Database name>]
go
create role [db_executor]
go
grant execute on schema::dbo to [db_executor]
go


---give db executor permission to the user for the database
exec sp_addrolemember N'db_executor',N'<login account>'
go

---give data reader permission to the user for the database
exec sp_addrolemember N'db_datareader',N'<login account>'
go

---give data writer permission to the user for the database
exec sp_addrolemember N'db_datawriter',N'<login account>'
go

---give ddl admin permission to the user for the database
exec sp_addrolemember N'db_ddladmin',N'<login account>'
go

SQL - check invalid logins in your SQL Server

Today we will learn how to scan for all the logins created in SQL Server that are no longer valid.

Run the below SQL query after replacing your SQL Server name:

-- drop leftover temp tables from a previous run
IF OBJECT_ID('tempdb..#invalidlogins') IS NOT NULL
    DROP TABLE #invalidlogins

IF OBJECT_ID('tempdb..#invalidlogins2') IS NOT NULL
    DROP TABLE #invalidlogins2

-- holds the raw output of sp_validatelogins (account SID + NT login name)
CREATE TABLE #invalidlogins(
      ACCTSID VARBINARY(85)
    , NTLOGIN SYSNAME)

-- holds the final result, tagged with the server name
CREATE TABLE #invalidlogins2(
      ServerName VARCHAR(100)
    , NTLOGIN SYSNAME)

-- sp_validatelogins returns Windows logins that no longer exist in the domain
INSERT INTO #invalidlogins
EXEC [<enter your server name>].master.sys.sp_validatelogins

-- copy the logins over once per server (the check matters when this block is repeated for several servers)
IF NOT EXISTS (SELECT * FROM #invalidlogins2
               WHERE NTLOGIN IN (SELECT NTLOGIN FROM #invalidlogins)
                 AND ServerName = '<enter your server name>')
BEGIN
    INSERT INTO #invalidlogins2
    SELECT '<enter your server name>', NTLOGIN FROM #invalidlogins
END

TRUNCATE TABLE #invalidlogins

SELECT * FROM #invalidlogins2 ORDER BY 1

-- clean up
IF OBJECT_ID('tempdb..#invalidlogins') IS NOT NULL
    DROP TABLE #invalidlogins

IF OBJECT_ID('tempdb..#invalidlogins2') IS NOT NULL
    DROP TABLE #invalidlogins2




SQL - Procedure to calculate Computed column values according to formulae stored inside table



Suppose we have formulas stored in our table rows, e.g. the formula for the second row is 1+2, meaning the first row's value plus the second row's value. We have to supply the ID field to determine the value and the formula. A sketch of such a procedure follows.
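A minimal sketch, assuming a hypothetical table dbo.FormulaTbl(ID int, Value decimal(18,2), FormulaText nvarchar(400)) where FormulaText holds an arithmetic expression over row IDs such as '1+2' (all names are illustrative, and the naive token replacement below would need real parsing once IDs reach two digits):

create procedure dbo.ComputeFormulaValue
    @ID int
as
begin
    declare @Formula nvarchar(400), @SQL nvarchar(max);

    select @Formula = FormulaText
    from dbo.FormulaTbl
    where ID = @ID;

    -- substitute each referenced row ID with that row's stored value
    select @Formula = replace(@Formula, cast(ID as varchar(10)), cast(Value as varchar(30)))
    from dbo.FormulaTbl
    where ID <> @ID;

    -- evaluate the resulting arithmetic expression via dynamic SQL
    set @SQL = N'select ' + @Formula + N' as ComputedValue';
    exec (@SQL);
end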

AX 2012 - Make SSRS Report Using SQL Stored Procedure


Today we will learn how to create AX SSRS reports using SQL queries / stored procedures.
Open SQL Server Management Studio and navigate to your Dynamics AX database. Create a new stored procedure.



Write the query in the stored procedure according to your needs. A minimal example follows.
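A minimal sketch of such a procedure (CUSTTABLE is a standard AX 2012 table; the procedure name and the company filter are assumptions):

-- returns customers for one AX company, to be bound to an SSRS dataset
create procedure dbo.CustListReport
    @DataAreaId nvarchar(4)
as
begin
    select ACCOUNTNUM, DATAAREAID
    from dbo.CUSTTABLE
    where DATAAREAID = @DataAreaId;
end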