Workflow 1: Connect to SQL Server and Set the Correct Database Context
Most table-design mistakes in real projects are not syntax errors—they are “wrong place” errors: running a script against the wrong server, the wrong database, or the wrong schema. Your first repeatable workflow is to connect deliberately and verify context before you execute anything.
SSMS: Connect and verify context
- Connect: Open SSMS > Connect > Database Engine. Enter the server name (for local instances: localhost or .\SQLEXPRESS), choose authentication, then connect.
- Confirm server: In Object Explorer, the top node shows the server you are connected to. If you have multiple connections open, expand the correct one before working.
- Open a query window in the right database: In Object Explorer, expand Databases, right-click the target database, choose New Query. This is safer than opening a blank query window and forgetting to change database.
- Verify with a quick check: Run
SELECT @@SERVERNAME AS server_name, DB_NAME() AS database_name;
Azure Data Studio: Connect and verify context
- Connect: In Azure Data Studio, open Connections > New Connection. Provide server, authentication, and optionally select a database.
- Confirm database context: In the editor, check the database selector in the toolbar (it shows the current database). If it is not correct, change it before running scripts.
- Verify with a quick check: Run
SELECT @@SERVERNAME AS server_name, DB_NAME() AS database_name;
Context-setting habit: start scripts with USE (carefully)
For scripts you share or re-run, explicitly set context near the top. This reduces accidental execution in the wrong database.
USE YourDatabaseName; -- set database context intentionally (review before running)

If you work in environments where USE is restricted, or you run scripts through tools that manage context differently, rely on opening the query window from the database node (SSMS) or selecting the database in the connection/editor (Azure Data Studio).
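For shared scripts you can go one step further and fail fast when the context is wrong. A minimal sketch, assuming your practice database is named YourDatabaseName (a placeholder) and SQL Server 2012 or later (THROW):

```sql
-- Guard clause: THROW terminates the batch, so nothing below it
-- runs if the script is opened against the wrong database.
IF DB_NAME() <> N'YourDatabaseName'
    THROW 50000, 'Wrong database context - review before running.', 1;

-- ...rest of the script runs only in the expected database...
```

Put the guard at the top of each batch you care about; because GO starts a new batch, a guard in batch 1 does not protect batch 2.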
Workflow 2: Write and Execute Scripts Safely
Tooling is part of correctness. A safe execution workflow helps you avoid partial changes, unintended object creation, and hard-to-debug results.
Highlight what you intend to run
Both SSMS and Azure Data Studio will execute either the selection or the entire script depending on whether text is highlighted. Make it a rule: highlight the exact statement(s) you want to run when you are testing.
- SSMS: Highlight text and press F5 (Execute), or click the Execute button.
- Azure Data Studio: Highlight text and click Run (or press F5, depending on your keybindings).
Understand batches and GO
GO is not a T-SQL statement; it is a batch separator recognized by client tools like SSMS and Azure Data Studio. Some statements must be the first statement in a batch, and separating batches can change variable scope.
-- Batch 1 (variables declared here are not visible after GO, in Batch 2)
DECLARE @x int = 1;
GO
-- Batch 2
SELECT @x; -- will fail because @x is out of scope

When you see errors that mention “must be the first statement in a query batch,” check whether you need a GO before the statement.
Use transactions for testing changes
When experimenting with table changes or data modifications, wrap your test in an explicit transaction so you can roll back if the result is not what you want. This is especially useful when you are learning or when you are unsure how constraints will react.
BEGIN TRAN;

-- test changes here
-- Example: try an INSERT or an ALTER TABLE (where allowed)
-- If satisfied: COMMIT;
-- If not: ROLLBACK;

ROLLBACK;

Notes for practice: some DDL statements can be rolled back in SQL Server, but not all operations behave the same in every scenario (for example, certain operations may take schema locks or be blocked). In learning environments, the transaction pattern is still a strong habit for controlled testing.
Read Messages and Results deliberately
After execution, check both:
- Results grid: confirms returned data.
- Messages tab/pane: shows “Commands completed successfully,” row counts, warnings, and error details.
Row count messages like “(1 row affected)” are a quick sanity check that you changed what you intended.
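You can also capture the row count programmatically instead of eyeballing the Messages pane. A sketch using @@ROWCOUNT (the table, column, and key names are placeholders):

```sql
UPDATE dbo.YourTableName
SET SomeColumn = 'new value'
WHERE SomeKey = 42;

-- @@ROWCOUNT reports rows affected by the immediately preceding statement;
-- read it right away, because any later statement resets it.
IF @@ROWCOUNT <> 1
    PRINT 'Warning: expected exactly 1 row to change.';
```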
Workflow 3: Explore Table Definitions (Design, Scripts, Constraints, Indexes)
Correct table design is not only about writing CREATE TABLE. You must also be able to inspect what actually exists in the database: columns, nullability, defaults, constraints, and indexes. Both SSMS and Azure Data Studio support inspection, but SSMS is more feature-complete for design-time exploration.
SSMS: Design view and key properties
- View columns quickly: Object Explorer > Tables > right-click table > Design. You can see column names, data types, and nullability.
- Column properties: In design view, select a column and review the Column Properties pane (identity, computed, collation, etc.).
- Constraints and keys: In Object Explorer, expand the table node to see Keys, Constraints, Indexes, and Triggers.
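The same information is available from a script, which behaves identically in both tools. For example, the built-in sp_help procedure summarizes columns, identity settings, indexes, and constraints for a table (the table name is a placeholder):

```sql
-- Returns several result sets: columns, identity, indexes, constraints, and
-- referencing foreign keys for the named object.
EXEC sp_help 'dbo.YourTableName';
```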
SSMS: Generate scripts (the most reliable inspection)
Design view is convenient, but scripts are unambiguous. Use script generation to see the exact definition of the table as SQL Server has it stored.
- Object Explorer > right-click table > Script Table as > CREATE To > New Query Editor Window.
- Also useful: Script Table as > ALTER To to see change-oriented scripts.
Azure Data Studio: Inspect via scripts and system views
Azure Data Studio is optimized for editing and running scripts. It may not expose every design UI that SSMS does, so rely on script-based inspection.
- Object Explorer: Expand your connection > Databases > your database > Tables. Right-click a table to see available actions (varies by version/extensions).
- Use built-in metadata queries: Query system views to inspect columns and constraints.
-- Columns and types
SELECT
    c.name AS column_name,
    t.name AS data_type,
    c.max_length,
    c.precision,
    c.scale,
    c.is_nullable
FROM sys.columns AS c
JOIN sys.types AS t
    ON c.user_type_id = t.user_type_id
WHERE c.object_id = OBJECT_ID('dbo.YourTableName')
ORDER BY c.column_id;

-- Constraints (high-level)
SELECT o.name AS constraint_name, o.type_desc
FROM sys.objects AS o
WHERE o.parent_object_id = OBJECT_ID('dbo.YourTableName')
  AND o.type IN ('D','C','F','PK','UQ');

These queries help you confirm what exists even when UI options differ.
Workflow 4: Read and Interpret Common Error Messages (Data Types and Constraints)
SQL Server error messages often look intimidating, but they are structured. Train yourself to extract: (1) what operation failed, (2) which object/constraint, (3) which value or column, and (4) what to fix.
Common data type errors
- Conversion failed: typically means SQL Server could not convert a string to a numeric/date type (or similar). You will see messages like "Conversion failed when converting the varchar value 'abc' to data type int." Fix by correcting the input, using the right parameter type, or explicitly converting with validation.
- Arithmetic overflow: the value does not fit the target type/precision. Example: inserting a large number into a small numeric type. Fix by choosing a type that can store the range or adjusting precision/scale.
- String or binary data would be truncated: a value is longer than the column length. Modern SQL Server versions often include the table and column in the message. Fix by shortening the value or increasing the column length.
- Operand type clash: incompatible types in an operation (for example, trying to store a uniqueidentifier into an int column). Fix by aligning types across columns/variables/parameters.
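One defensive pattern for the conversion errors above is TRY_CONVERT (available since SQL Server 2012), which returns NULL instead of raising an error, so you can detect bad values without failing the whole statement. A sketch:

```sql
-- TRY_CONVERT returns NULL when the conversion fails,
-- which lets you find and handle bad input explicitly.
SELECT
    TRY_CONVERT(int, 'abc') AS bad_value,   -- NULL: 'abc' is not an int
    TRY_CONVERT(int, '123') AS good_value;  -- 123
```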
Common constraint-related errors
- Cannot insert the value NULL: you inserted/updated without providing a value for a NOT NULL column (and no default applied). Fix by providing a value or adding a default where appropriate.
- Violation of PRIMARY KEY or UNIQUE constraint: duplicate key value. The message names the constraint and often shows the duplicate key. Fix by preventing duplicates, changing the key, or correcting the data load.
- CHECK constraint violated: the value does not satisfy a rule. The error names the check constraint. Fix by correcting the value or adjusting the rule.
- FOREIGN KEY constraint conflict: you referenced a parent key that does not exist, or attempted to delete a parent row referenced by children. Fix by inserting parent rows first, correcting keys, or adjusting delete/update behavior.
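When you want to handle these errors in code rather than only read them in the Messages pane, TRY...CATCH exposes the same details programmatically. A sketch (the table and column names are placeholders):

```sql
BEGIN TRY
    INSERT INTO dbo.YourTableName (EmailAddress)
    VALUES ('not-an-email');
END TRY
BEGIN CATCH
    -- ERROR_NUMBER and ERROR_MESSAGE describe what failed; for constraint
    -- violations, the message text includes the constraint name.
    SELECT ERROR_NUMBER() AS error_number, ERROR_MESSAGE() AS error_message;
END CATCH;
```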
Tooling tip: use the error line number
Errors usually include “Line X” referencing the batch. If you use GO, line numbers reset per batch in many tools. If the line number seems “wrong,” check whether the failing statement is in a later batch.
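A quick way to see this for yourself (assuming your tool reports line numbers per batch, as SSMS typically does):

```sql
SELECT 1;   -- Batch 1
GO
SELECT 1;   -- Batch 2, line 1
SELECT 1/0; -- Batch 2: the divide-by-zero error is typically reported as Line 2,
            -- not line 4 of the file, because numbering restarts after GO
```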
Guided Exercise: Create a Table from a Script, Confirm It, Script It Back Out, Then ALTER It
This exercise builds a repeatable workflow: create from script, verify in Object Explorer, inspect generated DDL, then make a controlled change and observe the effect. Use a dedicated practice database and schema you are allowed to modify.
Step 0: Confirm you are in the right database
SELECT @@SERVERNAME AS server_name, DB_NAME() AS database_name;

If the database name is not your practice database, switch context using the database selector (Azure Data Studio), open a new query window from the database node (SSMS), or run USE after reviewing it carefully.
Step 1: Create the table from a script (run as a single, intentional unit)
Highlight the whole script below and execute it. It creates a simple table with a primary key, a uniqueness rule, and a check rule so you can observe constraint behavior in tooling.
CREATE TABLE dbo.ToolingPractice_UserAccount
(
    UserAccountId int IDENTITY(1,1) NOT NULL,
    EmailAddress varchar(254) NOT NULL,
    DisplayName nvarchar(100) NOT NULL,
    CreatedAt datetime2(0) NOT NULL
        CONSTRAINT DF_ToolingPractice_UserAccount_CreatedAt DEFAULT (sysdatetime()),
    CONSTRAINT PK_ToolingPractice_UserAccount PRIMARY KEY (UserAccountId),
    CONSTRAINT UQ_ToolingPractice_UserAccount_Email UNIQUE (EmailAddress),
    CONSTRAINT CK_ToolingPractice_UserAccount_EmailHasAt CHECK (EmailAddress LIKE '%@%')
);

Observe the Messages output to confirm success.
Step 2: Confirm the table exists in Object Explorer
- SSMS: In Object Explorer, expand your database > Tables. If you do not see the table, right-click Tables > Refresh.
- Azure Data Studio: In the Connections/Object Explorer view, expand your database > Tables. Refresh if needed.
Also confirm with a query (tool-independent):
SELECT OBJECT_ID('dbo.ToolingPractice_UserAccount') AS object_id;

Step 3: Review the generated script (what SQL Server thinks you created)
In SSMS, right-click the table > Script Table as > CREATE To > New Query Editor Window. Compare the generated script to what you wrote. Pay attention to:
- Constraint names (did they match what you expected?)
- Nullability and data types
- Default constraint placement
In Azure Data Studio, if scripting options are limited, use metadata queries to inspect constraints:
SELECT o.name AS constraint_name, o.type_desc
FROM sys.objects AS o
WHERE o.parent_object_id = OBJECT_ID('dbo.ToolingPractice_UserAccount')
  AND o.type IN ('D','C','F','PK','UQ')
ORDER BY o.type_desc, o.name;

Step 4: Test constraint behavior in a controlled way (use a transaction and roll back)
Run the following as a test. It intentionally triggers errors so you can practice reading messages. Keep it inside a transaction and roll back.
BEGIN TRAN;

-- 1) Should succeed
INSERT INTO dbo.ToolingPractice_UserAccount (EmailAddress, DisplayName)
VALUES ('alex@example.com', N'Alex');

-- 2) Should fail the CHECK constraint (no @)
INSERT INTO dbo.ToolingPractice_UserAccount (EmailAddress, DisplayName)
VALUES ('not-an-email', N'Bad Email');

-- 3) Should fail the UNIQUE constraint (duplicate email)
INSERT INTO dbo.ToolingPractice_UserAccount (EmailAddress, DisplayName)
VALUES ('alex@example.com', N'Duplicate');

ROLLBACK;

When an error occurs, read the message for the constraint name (for example, CK_... or UQ_...). That name tells you exactly which rule blocked the change.
Step 5: Make a controlled change with ALTER TABLE and observe the effect
Now you will add a new column with a default. This is a common, safe schema evolution step for learning tooling workflows. Execute the following statement (highlight it and run it):
ALTER TABLE dbo.ToolingPractice_UserAccount
ADD IsActive bit NOT NULL
    CONSTRAINT DF_ToolingPractice_UserAccount_IsActive DEFAULT (1);

Observe effects in three places:
- Object Explorer: refresh the table and check that the column exists.
- Script generation: script the table again in SSMS and confirm the new column and default constraint appear.
- Metadata query: verify the default constraint exists.
SELECT
    c.name AS column_name,
    c.is_nullable,
    dc.name AS default_constraint_name,
    dc.definition
FROM sys.columns AS c
LEFT JOIN sys.default_constraints AS dc
    ON dc.parent_object_id = c.object_id
   AND dc.parent_column_id = c.column_id
WHERE c.object_id = OBJECT_ID('dbo.ToolingPractice_UserAccount')
ORDER BY c.column_id;

Step 6: Practice a safe “change attempt” pattern (preview, then apply)
Before making further changes, get in the habit of previewing what you are about to change.
- Preview current definition: script the table (SSMS) or query metadata (Azure Data Studio).
- Apply one change at a time: one ALTER TABLE per step while learning.
- Re-inspect: refresh Object Explorer and re-run metadata queries to confirm the change.
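When you are finished, clean up so you can repeat the exercise from scratch (DROP TABLE IF EXISTS requires SQL Server 2016 or later; on older versions, check OBJECT_ID first):

```sql
-- Remove the practice table; IF EXISTS makes the cleanup safely re-runnable.
DROP TABLE IF EXISTS dbo.ToolingPractice_UserAccount;
```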