
MS SQL Server - a delicate matter...



Issue #255

CONTENTS

1. ARTICLES
1.1. SQL Server 2000 I/O Basics (continued)
2. ARTICLE LINKS
2.1. Articles in Russian
2.2. Articles in English
3. SQL.RU FORUM
3.1. Most popular topics of the week
3.2. Questions left unanswered
4. ANNOUNCEMENTS
4.1. A new contest on GotDotNet.ru!

Microsoft Data Access Components (MDAC) 2.8 SP1

Updated client components for SQL Server data access are available for download.

Microsoft Data Access Components (MDAC) 2.8 SP1 contains the same core data access components as Microsoft SQL Server OLE DB and the ODBC driver.

Russian edition: MDAC_TYP.EXE - 5971 KB
English edition: MDAC_TYP.EXE - 5958 KB

Overview

The MDAC 2.8 SP1 installer deploys the same core data access components as the Microsoft Windows XP SP2 operating system.

This release does not include Microsoft Jet, the Microsoft Jet OLE DB provider, the Desktop Database Drivers ODBC driver, or the Visual FoxPro ODBC driver.

System requirements:

  • Supported operating systems: Windows 2000, Windows 98, Windows ME, Windows NT

  • Can be used only on computers with x86-family processors

ARTICLES

SQL Server 2000 I/O Basics (continued)

Based on Bob Dorr's article SQL Server 2000 I/O Basics
Translated by Alexander Gladchenko

I/O requirements of the Microsoft SQL Server engine

At the engine level, the I/O subsystem of SQL Server 2000 is designed to meet all the requirements needed to guarantee data integrity. If your system fully satisfies the requirements presented in the following sections, SQL Server will be able to uphold the ACID properties for your databases.

Stable media

Any system running SQL Server must meet the stable-media requirements for transaction logs and database files. A system that uses caching disk controllers without battery backup, or that allows additional caching of data on its way to disk, is not safe for a SQL Server instance.
Make sure your system uses caching correctly and guarantees safe delivery of data to stable media.

Write ordering

SQL Server is built around the write-ahead logging (WAL) protocol described earlier, and supporting that protocol depends on the runtime environment preserving the order of writes.
For any system that uses remote mirroring, write ordering is critical to keeping timestamps consistent. Many remote-mirroring vendors implement generation logic designed to preserve the physical order of writes even when writes travel between different caches over additional protocols.
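The WAL rule can be illustrated with a toy model (a Python sketch, not SQL Server code; all names here are hypothetical): a dirty page may be flushed only after every log record up to the page's LSN has reached stable media, so the durable log always explains the durable page image.

```python
# Toy model of write-ahead logging (WAL): the log record describing a
# change must reach stable media before the data page it describes.

class StableMedia:
    def __init__(self):
        self.log = []        # durable log records
        self.pages = {}      # durable page images

media = StableMedia()

def modify_page(page_id, value, lsn, log_buffer, buffer_pool):
    """Change a page in memory and record the change in the log buffer."""
    buffer_pool[page_id] = (value, lsn)   # dirty page, not yet on disk
    log_buffer.append((lsn, page_id, value))

def flush_page(page_id, log_buffer, buffer_pool):
    """WAL rule: harden every log record up to the page's LSN first."""
    value, lsn = buffer_pool[page_id]
    while log_buffer and log_buffer[0][0] <= lsn:
        media.log.append(log_buffer.pop(0))   # log reaches disk first...
    media.pages[page_id] = (value, lsn)       # ...then the page may follow

log_buffer, buffer_pool = [], {}
modify_page(107, "A", lsn=1, log_buffer=log_buffer, buffer_pool=buffer_pool)
flush_page(107, log_buffer, buffer_pool)

assert media.log == [(1, 107, "A")]      # log record hardened
assert media.pages[107] == ("A", 1)      # page hardened after it
```

If the environment reorders these two writes, a crash between them leaves a page image on disk that no durable log record can explain, which is exactly what the write-ordering requirement forbids.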

Preventing torn I/O (split I/O)

SQL Server requires that its 8 KB pages be transferred as a single unit of data. Systems that split I/O must be configured so that they do not break I/O requests into smaller blocks. Some dynamic disks and volume managers set a block size equal to the cluster size, which can be smaller than or close to 8 KB. Such systems can spread a single SQL Server I/O request across several physical components; in doing so they can produce torn pages and violate write ordering.
Make sure your system does not split I/O in this way and does not produce torn pages.
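The failure mode can be sketched in a few lines (a Python toy model, not SQL Server internals): an 8 KB write split into 4 KB pieces where only the first piece completes leaves a page whose halves disagree, i.e. a torn page.

```python
# Sketch: an 8 KB page written as one unit vs. split into 4 KB blocks.
# If the volume manager splits the I/O and only some pieces reach disk
# before a failure, the on-disk page mixes old and new data: a torn page.

PAGE_SIZE, BLOCK = 8192, 4096

old_page = b"Z" * PAGE_SIZE          # image currently on disk
new_page = b"A" * PAGE_SIZE          # image SQL Server asks to write

def split_write(disk, data, completed_blocks):
    """Write block by block; stop early to model a failure mid-request."""
    for i in range(completed_blocks):
        disk[i * BLOCK:(i + 1) * BLOCK] = data[i * BLOCK:(i + 1) * BLOCK]

disk = bytearray(old_page)
split_write(disk, new_page, completed_blocks=1)   # power fails after 4 KB

torn = disk[:BLOCK] != disk[BLOCK:]   # the two halves disagree
assert torn
```

A system that transfers the full 8 KB as one atomic request never exposes this mixed state.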

Problems, pitfalls, and examples

The problems, pitfalls, and examples described in this chapter are ones encountered by Microsoft support engineers and the SQL Server development team. Many of them led to configuration changes or to fixes outside SQL Server, for example in server software, drivers, or other layers of the stack.
Because very different problems can arise, and many vendors are involved in the stable-media space, Microsoft recommends consulting your hardware and software vendors about whether your particular configuration is suitable for use with SQL Server 2000.

Stale reads and lost writes

What is a stale read at the hardware level, and how does it differ from a lost write?
Because a stale read can manifest as a lost write, and a lost write can manifest as a stale read, it is worth defining the terms explicitly.
The following definitions assume that all operating-system requests complete successfully and that the user-mode application uses the corresponding APIs correctly.
A stale read occurs when data returned by a ReadFile or ReadFileScatter request does not reflect the most recent successful write.
A lost write occurs when data sent through WriteFile or WriteFileGather never reaches stable media.
Look closely at the figure below, assuming that stable media can be abstracted as the image of ExpandFile.


Figure 2

A stale read occurs when the write shown in the figure, the fill with the character "A", is successfully written to disk, replacing the original data filled with "Z", yet a subsequent read of the same bytes (offset) in the file still returns "Z", while the current data containing "A" is unavailable.

  • A stale read occurs when stable media holds the "A" data but the hardware cache returns the "Z" data.

  • A lost write occurs when stable media still holds the "Z" data and the write of the actual "A" data was never carried through to completion.
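The two definitions can be pinned down with a tiny model (a Python sketch; the faulty never-invalidating cache is a deliberate simplification, not real controller behavior):

```python
# media = durable image on disk; cache = hardware read cache.

def read(cache, media, offset):
    # A faulty cache that never invalidates serves whatever it holds.
    return cache.get(offset, media[offset])

# Stale read: the write of "A" reached stable media, but the hardware
# cache still answers with the old "Z" image.
stale_media, stale_cache = {0: "A"}, {0: "Z"}
stale_read = read(stale_cache, stale_media, 0)

# Lost write: WriteFile reported success, but "A" never reached stable
# media; the disk still holds "Z" and there is nothing newer to serve.
lost_media = {0: "Z"}
lost_read = read({}, lost_media, 0)

assert stale_media[0] == "A" and stale_read == "Z"   # stale read
assert lost_media[0] == "Z" and lost_read == "Z"     # lost write
```

In both cases the caller sees "Z", which is why one failure can masquerade as the other; only inspecting the media distinguishes them.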

Пример чтения устаревших данных

В этом примере мы рассмотрим возможную ситуация чтения устаревших данных. Вспомним, что упреждающее чтение в SQL Server читает большими блоками, но при этом игнорируются страницы, состояние которых неизвестно. После завершения чтения, буфер будет немедленно возвращён в свободную область, и страница считается сохранённой к началу I/O запроса на упреждающее чтение.
В нашем примере, предположим, что страницы 107 и 108 находятся в буферном пуле SQL Server и считаются грязными.

Write of page 107 is issued

At the request of the lazy writer.

Write of page 108 is issued

At the request of the lazy writer.

Write of page 107 completes

The I/O request completes, and the lazy writer removes the page from the cache and returns its buffer to the free list.

A read request arrives for pages 100 - 115

Read-ahead is used for the data. Only page 108 is found in memory. Read-ahead performs the necessary buffering, but the page is not hashed, because the data is already in memory.

Read completes

The read completes. Key point: the contents of page 108 are unknown, because the page has not yet been written at the hardware level. SQL Server does not keep duplicates, so the buffer used for the read is returned to the free list as soon as the read completes.
Note: this version of page 108 is now held in the hardware read-ahead cache.

Write of page 108 completes

BUG: the hardware read-ahead cache still holds this page.
The lazy writer removes page 108 from the cache and returns its buffer to the free list.

A read request arrives for page 108

A physical read would be correct, but the stale image is served from the hardware read-ahead cache instead of the version on disk.

Trace flag 818

The instrumentation enabled by trace flag -T818 tracks writes of the last 2048 pages. When a write I/O completes successfully (correct page ID, correct number of bytes transferred, appropriate operating-system error codes), the DBID, page ID, and LSN are placed in a ring buffer. If a failure is detected, error 823 is raised.
When an 823 or 605 error is detected, SQL Server searches the ring buffer for the LSN that was recorded for the last write of the page. If it is incorrect, additional information is added to the SQL Server error log indicating the type of error along with the expected and retrieved LSNs.
Additional information about the LSN in question appears in the SQL Server error log. The LSN returned by the read is clearly older (stale) than the LSN recorded for the last write.

SQL Server has detected an unreported OS/hardware level read or write problem on Page (1:75007) of database 12 LSN returned (63361:16876:181), LSN expected (63361:16876:500) Contact the hardware vendor and consider disabling caching mechanisms to correct the problem

Starting with SQL Server 2000 Service Pack 4 (SP4), the -T818 design uses a hash table instead. More than 2048 entries are kept in the hash for 32-bit editions, and even more for 64-bit editions.
Because the implementation is a hash table, the stale-read check can be performed on every read, not only for pages implicated by earlier 823 or 605 errors. The check is similar to the standard checks of the page's object ID (605) and page number (823). Because it runs on every read, it also covers situations where the page ID and object ID are correct but the rows on the page are damaged.
For example, if a row was inserted, the page was flushed to disk, and a stale read then occurred (the rows are missing from the disk image), the object ID and page ID would still be correct, and before SP4 the stale read might never have been detected. The new design detects the LSN mismatch and raises error 823.
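The essence of the check can be sketched as follows (a hypothetical Python model of the idea, not the actual SP4 code; the LSN values are taken from the sample error message above):

```python
# Sketch of the SP4-style stale-read check: remember the LSN stamped on
# each page when its write completes, then compare it with the LSN found
# on the page when it is read back.

write_lsn = {}   # (dbid, page_id) -> LSN recorded at write completion

def record_write(dbid, page_id, lsn):
    write_lsn[(dbid, page_id)] = lsn

def check_read(dbid, page_id, lsn_on_page):
    """Return None if the read is current, or an 823-style report."""
    expected = write_lsn.get((dbid, page_id))
    if expected is not None and lsn_on_page < expected:
        return {"error": 823, "expected": expected, "returned": lsn_on_page}
    return None

record_write(12, 75007, (63361, 16876, 500))

# The hardware returns an older image of the page: LSN mismatch -> 823.
report = check_read(12, 75007, (63361, 16876, 181))
assert report == {"error": 823,
                  "expected": (63361, 16876, 500),
                  "returned": (63361, 16876, 181)}

# A current image passes the check silently.
assert check_read(12, 75007, (63361, 16876, 500)) is None
```

Because the comparison runs on every read, a page whose IDs are intact but whose rows are stale is still caught, which is the improvement over the pre-SP4 ring buffer.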

Torn pages

SQL Server database pages are 8 KB, while the typical block transferred at the hardware level is 4 KB built from 512-byte disk sectors. In a RAID configuration, the pieces that were not written can surface as a torn read. Because read and write timing can differ across the disks, a read can return partly old and partly new data. Again, the failure is that after the write, not all parts of the data were fully transferred from the cache to disk; the incorrect image remains in the hardware read-ahead cache until it is forced out.

Пример сброса на диск аппаратного кэша

Есть операции, которые принудительно сбрасывают на диск аппаратный кэш. Сброс на диск может разрешить ошибки, если они являются переходными.
Существует не много методов, которые могут быть вызваны в SQL Server, что бы достичь этого эффекта:

  • Запуск dbcc dropcleanbuffers, после чего будут удалены все буферы из буферного пула.

  • Запуск dbcc checkdb для базы данных, имеющих соответствующие проблемы.

Эти методы позволяют исправить проблему перехода за счёт повторения не правильных действий. Данная методика заставляет checkdb создать большое количество запросов на чтение, которые вынуждают прокручивать аппаратный кэш. Эта прокачка кэша выбирает из аппаратного кэша кэшируемые данные секторов и провоцирует правильное физическое чтение. В результате, будет получен правильный образ и SQL Server чудесным образом исправит проблему.
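Why a flood of reads "repairs" a stale image can be shown with a small LRU cache (a Python toy model of a generic hardware cache, not of any particular controller):

```python
# In a bounded LRU cache, a burst of unrelated reads evicts the stale
# entry, so the next read of the page must go to stable media and
# therefore returns the correct image.

from collections import OrderedDict

class LRUCache:
    def __init__(self, capacity):
        self.capacity, self.data = capacity, OrderedDict()

    def read(self, key, media):
        if key in self.data:
            self.data.move_to_end(key)
            return self.data[key]            # served from cache
        value = media[key]                   # physical read
        self.data[key] = value
        if len(self.data) > self.capacity:
            self.data.popitem(last=False)    # evict least recently used
        return value

media = {page: "Z" for page in range(200)}
cache = LRUCache(capacity=16)

cache.data[108] = "stale"        # bad image stuck in the hardware cache
media[108] = "A"                 # disk already holds the correct data
assert cache.read(108, media) == "stale"

for page in range(100):          # checkdb-style read flood churns the cache
    cache.read(page, media)

assert cache.read(108, media) == "A"   # eviction forced a physical read
```

This is the mechanism behind the "miracle cure": nothing in SQL Server changed, the bad cache entry was simply pushed out.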

TO BE CONTINUED


ARTICLE LINKS

Articles in Russian

Monitoring and improving the efficiency of information systems: the Embarcadero approach
Vladimir Dudchenko
CASE: The difficulty of improving IT efficiency through extensive growth of information resources is familiar to many IT managers today. Server platforms, DBMSs, and other infrastructure components remain expensive, yet large investments in added capacity do not always live up to expectations: performance, reliability, response times, and resilience under peak load often fall short of what the business requires and fail to justify the investment...

Articles in English

Detecting Stalled and Stuck I/O as it relates to SQL Server 2005
Blog Entries
The other day I pointed all of you to Robert Dorr’s new paper titled, "Detecting and Resolving Stalled and Stuck I/O Issues in SQL Server 2000 SP 4". The paper describes the introduction of a new feature for detecting stalled I/O. For SQL Server 2000, this feature is turned on with trace flag -T830

Algorithmic Trading
TSQLNet Weblog - Cihangir Biyikoglu
I recently started reading about Algorithmic Trading. There is a huge potential here for computer science. According to this presentation by Carl Carrie - JPMorgan & Scott Atwell - Manager of FIX Trading and Connectivity

Recursion and FOR XML
Michael Rys
Kent just published an example on how to get recursive XML structures (think employee-manager). The new version of the FOR XML whitepaper actually contains a section with such a sample (I need to get the MSDN team to publish it). I don't want to spoil the content of the whitepaper here, but I would like to add some comments to Kent's approach (I am sending some comments on minor coding issues to Kent directly)

Nice one Maestro - the next generation of the Business Scorecard Accelerator is now in beta 1
Mat Stephen's WebLog
Thank heavens – Microsoft is going to charge for its next version of its business scorecard solution code named Maestro, beta 1 of which has just been announced. Why is this charge a good thing? Because if people have to pay for this product then it will hopefully be seen to have value and it should also get the marketing, attention and respect it deserves

Database snapshots to feed a data warehouse?
SQL BI
I just attended the pre-conference of Kimberly Tripp at SQL Pass 2005 Europe. The topic was "New SQL Server 2005 features that a DBA need to learn" and I took advantage of this day to cover an area of SQL2005 I still haven't explored

Fun with Nested Tables
DMTeam
This tip describes the modeling impact of setting usage flags on nested tables and the columns within. Nested tables in Mining Models allow for a great deal of flexibility and creativity in how you model business problems using SQL Server Data Mining by allowing you to combine demographic or dimensional style data with transactional data in the same model. However, due to the semantics of the language, understanding how your usage flags impact the model can be somewhat confusing. In this tip we’ll review how the usage flags impact how attributes are created for the model when nested tables are used

SQL Server 2005 T-SQL: Aggregates and the OVER clause
adam machanic
A new feature added to SQL Server 2005 for the sake of the windowing functions is the OVER clause. Using this clause, you can specify ordering or partitioning for the windowing functions. For instance, to enumerate the names of all of the products in the AdventureWorks database that have a list price, along with their list prices and the rank of those prices compared to all of the other prices, the following query can now be used

Build a Visual Studio Add-in to Evaluate XPath/XQuery Expressions
Francis A. Shanahan
Visual Studio provides a solid toolset out of the box. The Solution Explorer, Properties window, Class Explorer, Command window, and the rest each play an integral role in the development process. But, since Microsoft couldn't anticipate every need a developer might have, there are times when it's necessary to go beyond the default IDE to perform a particular task. Visual Studio supports expansion through the IDTExtensibility2 interface. In this article, Francis A. Shanahan shows you how to implement the IDTExtensibility2 interface and create a simple but useful Add-in to execute XPath or XQuery expressions against an XML document without ever leaving Visual Studio

Regular Expressions in T-SQL
Ken Henderson's WebLog
Given that a recent blog of mine lamented the dangers of using xprocs, I thought it would be a good time to show some of the wonderful stuff you can do thanks to SQL Server extensibility features such as xprocs and COM objects. It’s not that xprocs or in-process COM objects are inherently evil -- it’s just that they can be misused and that they can be challenging for the average developer to code properly

Stored Procedures, Bad?
The SQL Doctor is In
Apparently all of this time I was using them to build well organized, well structured systems I was wrong. Sarcasm aside, I kind of felt that I had to say something, if not very complete. Coming to you through the TechEdBloggers.net RSS feed, then through Ron Crumbaker's blog, comes this, from a C# MVP. It is an old post from November, 2003 so possibly his mind has been changed? Bringing this up seems like digging into a old wound, but I didn't start it. I would like to hear other opinions however

An Enigmatic Error while Returning UDT Results
SQL Server 2005: CLR Integration
Let's say we have a very simple (and silly) CLR UDT defined and loaded into SQL Server named CLRInt, which exists solely to wrap a .NET integer and provide a ToString() method. We can now try to perform a very basic manipulation of this type

Microsoft Maestro Takes Its Bow
Barbara Darrow
Microsoft is now ready to chat about Maestro, its heretofore secret realtime analytics server

Wowee Zowee
Direct Reports
Reporting Services Service Pack 2 requires a reboot. Here's why

Distant Plastic Trees
Direct Reports
One of the cool new features of SQL Server 2005 Reporting Services is multi-valued parameters. SQL Server 2000 Reporting Services only allowed you to have a single value for a parameter. For example, if your parameter was color, you could pick red or blue but not red and blue. Using expression-based queries, you could get around this limitation by passing in the delimited strings as a string parameter. However, if you weren't careful with parameter validation, this could leave your report open to a SQL injection attack. With SQL Server 2005 Reporting Services, you can simply mark the parameter as multi-valued in the Report Parameters dialog

Isn't Anything
Direct Reports
For those of you who were on the beta of Reporting Services SP2, when you upgrade to the final release you may have a problem when using the new client print functionality. The issue is that we did not change the CLSID of the print control object between the beta and the final build. The control will complain about not being able to load resource strings and the text in the print dialog will not appear correctly. To work around this, you need to delete the "RSClientPrint Class" from the "%Windir%\Downloaded Program Files" directory. The next time you launch the print functionality, the correct version of the control will be downloaded

Unknown Country
Direct Reports
There is an oversight in the documentation of the new client print functionality of Reporting Services Service Pack 2. SP2 has a new behavior that extracts the page size and margin information at report publishing time so that it can be used by the print control. If you look at the Property column of the Catalog table in the ReportServer database, you will see the following on a report published with SP2

Using Reporting Services SharePoint Web Parts in SQL Server 2000 Reporting Services Service Pack 2
Brian Welcker, Kathy MacDonald
This paper describes the new SharePoint Web Parts included with SQL Server 2000 Reporting Services Service Pack 2. It includes information about installing the Web Parts, ways to use them, features they support, and troubleshooting problems. The SQL Server 2000 Report Packs are used as examples

Atomizer
Direct Reports
One of the new features of SQL Server 2005 Reporting Services is support for expression-based connection strings. Many customers have the same database schema in multiple database or servers. In SQL Server 2000 Reporting Services, it was hard to have a single report that would work against multiple data sources. Below is a sample RDL snippet for a data source with an expression-based connection string which takes the server and database names from report parameters

Brotherhood
Direct Reports
So, what's a CTP anyway? Recently, we announced that instead of releasing SQL Server 2005 Beta 3, we would move to a model of releasing multiple CTPs (Community Technology Preview) up until the final release of the product later this year. But what exactly does this mean? How is a CTP different than a beta?

Report-enable Your Web Apps with Microsoft Reporting Services
Teo Lachev
Reporting is an integral part of almost all of today's Web applications. However, while the tools to create such applications have matured, Web reporting often boils down to generating tabular reports in the form of HTML tables. Microsoft SQL Server 2000 Reporting Services fills in this gap and gives you the necessary tools to easily report-enable your Web applications. In this article, which is adapted from his book, Microsoft Reporting Services in Action, Teo Lachev shows you how to leverage Reporting Services to generate reports on the server side of your ASP.NET application

No Pocky for Kity
Direct Reports
While SQL Server Reporting Services supports a variety of built-in aggregates, one of the things that people ask for is the ability to define custom aggregates. While this is not directly supported in RDL, there is a trick you can do using the <Code> block within the report

Three Imaginary Boys
Direct Reports
If you are using the matrix control in Reporting Services, you may want to create a static heading over top of your column headers. For example

Using OpenXML
Raj Vasant
SQL Server 2000 added XML support a while back, though it was limited in what it offered and can be a little confusing. SQL Server 2005 will add many more features and functions, but in the meantime, if you are looking to add some XML support to your 2000 server, new author Raj Vasant brings us some basics on using OpenXML

Trick for figuring out why no rows returned from a query
The SQL Doctor is In
I was working on a query the other day, trying to figure out why there were not any rows being returned. It was this ugly query with like 10 tables inner joined together coming from Analysis Services, which meant that they were joined together using old style inner joins, which are so hard to debug (versus the new style joins, where you can just comment out one join at a time). So I just turned on the actual query plan and ran the query. No rows returned, but looking at the plan I could see exactly where this occurred by looking to the right of the plan for the first extremely skinny line, telling me that 0 rows survived the join

Cascading many-to-many dimensions: unsupported scenario?
SQL BI
I already opened a bug and written on the newsgroup about this specific scenario, but I still haven't received an answer. This morning I met Chris Webb in Munich (I'm attending SQL Pass Europe) and he told me that one year ago someone at MS told him that it's an unsupported scenario. It would be good to have a more or less official comment from someone at MS, because at this time the behaviour of SSAS is very disappointing (you could design a model that really gives you wrong data). The following is the description of the scenario

Building a Real-Time Integrated Business Intelligence System on the Microsoft Development Platform
BearingPoint
Surveys indicate that few CEOs believe their strategies are being implemented effectively by their organizations. There can be several reasons for this. Strategies are expressed in broad terms but not translated into operational actions. Business plans are not aligned with strategic priorities. Executives focus on financial outcomes while overlooking the operational issues that drive financial results. Furthermore, few organizations have a real-time handle on their financial outlooks other than at quarter-end, which prevents them from adjusting quickly to significant changes in the environment

Checklist: Load testing -- the magic bullet for SQL Server performance tuning
Jeremy Kadlec
Few things are worse than rolling out a new application and its performance is so bad that it brings the business to a screeching halt. This is not a new phenomenon; it's a reality that I have seen from time to time. I am willing to bet that you have experienced it, too. So what is the solution for preventing these performance problems?

SQL Server backups made easy using Maintenance Plans
Greg Robidoux
If you're new to SQL Server and not sure how to set up backups, Maintenance Plans may be the way to go. Although not all of the backup functionality is available using Maintenance Plans, Microsoft has developed an easy interface to set up backups for all of your databases

SQL Server OLTP vs. data warehouse performance tuning
Jeremy Kadlec
SQL Server delivers a number of functions for small to large organizations with needs ranging from internally developed applications to third-party off-the-shelf software. With all systems, data collection, entry and analysis at some level is required to meet organizational needs

Selecting a SQL Server backup model
Greg Robidoux
In a previous tip we discussed the different recovery models SQL Server offers and decision points on which to base your recovery model selection. After you've selected the appropriate recovery model, you need to put in place the proper backup strategy to minimize data loss and downtime in case of a database failure

...And how do you expect me to audit SQL Server?
Jeremy Kadlec
Is your organization faced with pending audits resulting in long and tedious meetings to try to achieve compliance? Have you been hearing a foreign set of acronyms around the water cooler like HIPAA, SOX, GLBA and SAS 70? Have these new sets of requirements been causing consternation for your DBAs, Developers and Management? Have you been trying to figure how you are going to write the code to capture the data needed for the continuous audits in between the remainder of the company projects? With the recent release of Entegra 2.0 from Lumigent, they offer the IT organization, specifically the DBAs, a new set of capabilities to audit all of the activity in a SQL Server database with real time alerting for IT Staff and reporting capabilities for compliance teams

Microsoft 'Maestro' to Tune Up Business Performance
Peter Galli
Microsoft will release to private beta this week a new, server-based business-performance-management scorecard application, code-named "Maestro," which helps users perform deep analysis by leveraging the Microsoft Office System to build, manage and use scorecards and key performance indicators

A Fresh Look at Data Warehouse Technology: Introducing the Data Warehouse Appliance
William McKnight
If you haven't heard, solutions adopting the appliance nomenclature - as in "data warehouse appliances" - have emerged as viable short-list solutions for new or refurbishing data warehouse efforts

ETL for the “Unstructured” Data Mart
Bill Lewis
This article is part of a series discussing the integration of iterative data (commonly known as “structured data”) and narrative data (commonly referred to as “unstructured data”). In short, iterative data repeats, whereas narrative data tells a story

Security Tools for Development
Sergey Simakov blog
Last Friday at the Microsoft Moscow office, Ivan Medvedev (from the SWI team) made a presentation about Security Tools for Software Development. He mentioned the new Threat Modeling tool, AppVerifier, PreFast, FxCop, and new Whidbey compiler switches

Microsoft's Vision for an Identity Metasystem
Microsoft Corporation
The Identity Metasystem is an interoperable architecture for digital identity that assumes people will have several digital identities based on multiple underlying technologies, implementations, and providers. Using this approach, customers will be able to continue to use their existing identity infrastructure investments, choose the identity technology that works best for them, and more easily migrate from old technologies to new technologies without sacrificing interoperability with others. This paper introduces the principles behind this approach in "The Laws of Identity"; it presents an open and interoperable architecture for building the metasystem, and it describes Microsoft's plans to participate in the identity metasystem. (12 printed pages)

Major Trends in the SQL Server Market
Brad M. McGehee
An Interview with Simon Galbraith, Marketing Director of Red Gate Software

New stored proc posted: sp_foreachroutine
adam machanic
Like sp_MSforeachtable and sp_MSforeachdb? Then you'll love sp_foreachroutine! Okay, that was cheesy. But it got the point across. Same as those undocumented MS-shipped procs, but it operates on routines (procedures/functions/views/triggers) instead

Checking the implementation model...
The SQL Doctor is In
I was fiddling about in a database working on a few things when I noticed that there was duplicated data in a column that I did not expect to have duplication in. I went to my ERwin model and sure enough, no alternate key was defined on the column in question. It was only a test database, but still, what a mess that would be if I had forgotten to put this key on in the production version. To that end I wrote a couple of queries to check for objects that did not have a primary key, or they have a primary key that is a surrogate identity value but have no UNIQUE constraints assigned

Improving SQL Full Text Search Performance
Hilary Cotter
In this article, we will look at tips to improve SQL Server 2000 Full Text Search (FTS) querying and indexing performance. I have divided this article into two sections: one on improving query performance and another on improving indexing performance

SQL Server 2005 Authentication and Regulatory Compliance
John Paul Cook
Logging in to SQL Server is so routine, it’s usually taken for granted; however, it is of critical importance. Not only is authentication the first line of defense in preventing unauthorized access, it is one of the first items auditors investigate when assessing regulatory compliance. Among the many regulations that directly or indirectly require organizations to have effective authentication mechanisms in place are

SQL Server 2005 Availability and Recovery Features
searchsqlserver.techtarget.com
A chapter download from Microsoft SQL Server 2005 New Features by Michael. This chapter discusses SQL Server 2005’s new availability and recovery features to help you better understand how they can be used to implement SQL Server 2005 in a highly available and recoverable production database environment. Microsoft Corp

Effective Strategies for Data Mining
Microsoft Corporation
Data mining is an effective set of analysis tools and techniques used in the decision support process. However, misconceptions about the role that data mining plays in decision support solutions can lead to confusion about and misuse of these tools and techniques

SQL Server 2000 Resource Kit
Microsoft Corporation
This SQL Server Resource Kit is designed for database administrators and developers who already use SQL Server and want some helpful tips, advanced techniques, and useful tools or samples to enhance their work with SQL Server 2000. It is the premier guide for deploying, managing, optimizing, and troubleshooting SQL Server 2000

Manipulating the Windows Registry Using SQL Server Extended Stored Procedures, Part 2
Gregory A. Larsen
This is the second article in my series discussing how to manipulate the Windows registry using SQL Server Extended Stored Procedures. If you are writing applications that require you to read and write to the Windows registry, this article will give you some insight into how to manipulate the registry using T-SQL. In this article I will discuss how to enumerate the keys and values of a specific registry structure, how to delete registry entries, and how to create and manipulate registry entries that contain multiple values.

Differences between SQL-89, SQL-92 & SQL-1999
Adam Machanic
Could you please tell me, in brief, the differences between SQL-89, SQL-92 and SQL-1999, and whether Oracle and SQL Server fully adhere to SQL-1999?

How to set up a QA box that closely simulates production
Steven Andres
I want to recommend that our development shop set up a QA box that closely simulates production. My manager said to put together a proposal. Right now the development box is too small and has an older version of SQL Server, plus the operating system is NT. I'm wondering how to set up the environment. Getting the machine is the simple part. Should we get MSDN Universal, install the same operating system as production (Windows 2000 Server) and the same database (SQL Server 2000)? That seems like the least expensive way. What has been your experience?

Active/active or active/passive clustering?
Jeremy Kadlec
I am looking at clustering. My only concern is whether to go active/active or active/passive. I understand active/passive would basically mean that one machine won't do anything. But going active/active increases the complexity of the solution. We'd then have SQL clustering a/a, SAN, HBA, etc. What would you suggest?

How to display the XML result of 'Select For XML Explicit'
Andrew Novick
I'm using the For XML Explicit statement in a stored procedure. I would like to see the full result set, but only get a partial result set in the Query Analyzer result window. Is there an easy way to display the XML result or save it to a file so I can verify whether the resulting XML is well formed for the eventual user of this output?

Visual FoxPro 9.0 more SQL Server friendly
Jennifer Lawinski
Database developers working with SQL Server 2000 or the soon-to-be-released Yukon might find a useful tool in Microsoft's recently released Visual FoxPro 9.0 database and desktop

A simple stored procedure to list database objects by selected type(s)
Brian Walker
In the coming months, I will present a series of tips that present SQL code for a variety of utility stored procedures. These stored procedures are intended primarily for administrative purposes, and they are designed to be installed as system stored procedures. A system stored procedure is created in the master database with a prefix of "sp_". A system stored procedure is invoked like any other, but the "sp_" prefix tells SQL Server to look for the routine in the master database before looking elsewhere. Therefore, do not use the "sp_" prefix on a typical application stored procedure. A system stored procedure executes in the context of the current database even though it exists in the master database. This behavior makes system stored procedures great for implementing generic solutions to common tasks. It's helpful to use routines like these in any database
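The behavior the tip relies on can be sketched in a few lines (the procedure name here is invented for the illustration):

```sql
USE master
GO
-- The sp_ prefix makes SQL Server resolve the name in master,
-- but system-table references run against the *current* database.
CREATE PROCEDURE sp_list_user_tables
AS
    SELECT name
    FROM sysobjects       -- resolved in the database the caller is in
    WHERE type = 'U'      -- U = user table
    ORDER BY name
GO
```

Executed from any database with `EXEC sp_list_user_tables`, it lists that database's user tables, even though the procedure itself lives in master.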

An introduction to SQL Server 2005's CLR functions
Serdar Yegulalp
Among the many highly touted features in SQL Server 2005, the one that probably has the most relevance for programmers who work with SQL Server is the Common Language Runtime, or CLR for short. CLR allows programmers to create stored procedures, triggers, user-defined functions, aggregates and types directly inside SQL Server. CLR has a lot of promise, but it also has some pitfalls

SQL Server 2005 Express
Serdar Yegulalp
Along with the upcoming release of SQL Server 2005, Microsoft is introducing a new edition of SQL Server called SQL Server Express

SQL Server 2005 bitmap index?
Adam Machanic
Does SQL Server 2005 have a bitmap index like Oracle? No. SQL Server 2005 does not include any new index types to support relational objects

Bringing XML and SQL Server closer
Ed Tittel
Developers who work with multiple kinds of data in any kind of volume typically find they must use databases to access relational data and XML to access document data. For example, report generators, or enterprise resource planning applications, work mostly with relational databases, yet repackaging that data for interaction with suppliers or customers typically requires working with one or more XML applications

Microsoft expected to spell it out for us in '05
Margie Semilof
Microsoft will continue efforts to raise its profile in the data center in 2005, as well as define strategies such as the Dynamic Systems Initiative and Common Engineering Criteria, industry observers predict

Information Objects and Enterprise Information Integration
Actuate Corporation
The problem with data integration is that it will always be a problem. Like the perfect round of golf, the ideal data integration architecture can never really be achieved. There are too many forces, acting simultaneously, preventing IT from making this goal a reality. Everything from corporate initiatives, software purchases and upgrades, corporate acquisitions, competitive pressure, and personnel changes keep IT from delivering the perfect information platform

How to improve performance of typed DataSet creation
Uri N.

Develop a Data Quality Strategy Before Implementing a Data Warehouse
Joseph Hudicka
The importance of data quality with respect to the strategic planning of any organization cannot be stressed enough. The Data Warehousing Institute (TDWI), in a recent report, estimates that data quality problems currently cost U.S. businesses $600 billion each year. Time and time again, however, people claim that they can't justify the expense of a Data Quality Strategy. Others simply do not acknowledge the benefits.

The Data Warehouse Foundation
Bill Inmon
The Web-based e-business environment has tremendous potential. The Web is a tremendously powerful medium for delivery of information. But there is nothing intrinsically powerful about the Web other than its ability to deliver information. In order for the Web-based e-business environment to deliver its full potential, the Web-based environment requires an infrastructure in support of its information processing needs. The infrastructure that best supports the Web is called the corporate information factory. At the center of the corporate information factory is a data warehouse

Data Warehouse Types
Bill Inmon
In the beginning of data warehousing, the data warehouse was given a simple definition. That definition was (and still is today)

Data Modeling for the Data Warehouse
Bill Inmon
In order to be effective, data warehouse developers need to show tangible results quickly. At the same time, in order to build a data warehouse properly, you need a data model. And everyone knows that data models take huge amounts of time to build. How then can you say in the same breath that a data model is needed in order to build a data warehouse and that a data warehouse should be built quickly? Aren’t those two statements completely contradictory?

A Data Modeler's Bag of Tricks
Bill Lewis
“The justification of all...models is that...they force analysts to confront possibilities that would not have occurred to them otherwise.” Sylvia Nasar, A Beautiful Mind

Terabyte Data Warehouse Table Design Choices, Part 2
David Beulke
In the first part of this article, I discussed partitioning large terabyte tables using various keys, combinations of keys, and the important points of proper space definition for indexspaces and tablespaces. In this article, the discussion will be on design alternatives for large terabyte data warehousing tables using DB2 Version 8 for Linux, UNIX and Windows (LUW). DB2 LUW provides many design options, such as Multi-Dimensional Clusters (MDC) and a UNION ALL views technique, for managing maintenance and performance in your large data warehouse environment.

Detecting and Resolving Stalled and Stuck I/O Issues in SQL Server 2000 SP 4
Robert Dorr
Guest columnist Robert Dorr explores how the reporting tool in SQL Server 2000 Service Pack 4 significantly reduces the time it takes to identify and determine the underlying causes of stalled and stuck I/O operations. (7 printed pages)

Using recursion in stored procedures
Arthur Fuller
The basic concept of recursion is straightforward: a given chunk of code calls itself until some boundary condition is reached. We demonstrate how to use recursion in T-SQL
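The boundary-condition idea can be shown with the classic factorial example (the procedure name is invented for this sketch; note that SQL Server 2000 caps nesting at 32 levels, so @n must stay small):

```sql
CREATE PROCEDURE usp_factorial
    @n      int,
    @result bigint OUTPUT
AS
BEGIN
    IF @n <= 1
        SET @result = 1            -- boundary condition: recursion stops here
    ELSE
    BEGIN
        DECLARE @prev int, @sub bigint
        SET @prev = @n - 1         -- T-SQL cannot pass expressions to EXEC
        EXEC usp_factorial @prev, @sub OUTPUT   -- the procedure calls itself
        SET @result = @n * @sub
    END
END
```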

Floating point fun
Ken Henderson's WebLog
I've been an advocate of using SQL Server for more than a mere datastore for about as long as I've been working with the product. Too many times, I've walked into development shops only to discover that they're using SQL Server no differently than they'd use Access or some other DBMS with far fewer features because they take a "least common denominator" approach -- only using features that most DBMSs have in common. I always believed that that was a mistake and that it was usually the result of developers not taking the time to get acquainted with the facilities and features the product offers. I've been a long-time proponent of using "what comes in the box" -- regardless of whether we're talking about SQL Server, the C++ STL, the Delphi VCL, the .NET Framework, or some other piece of the software development puzzle. For me, it just made sense to at least look at leveraging the features you get free with the development tool(s), DBMSs, and OS platform you're using rather than needlessly duplicating them in custom code. It's been my experience that the code coming out of places like Microsoft, Oracle, Borland, IBM, and their industry partners is generally of a higher quality than the average developer is capable of producing. To put it succinctly, the average developer isn't likely to do as well as a large software vendor in building the plumbing and widgets the typical app requires, and may do far worse. For me, it's just commonsense to at least take a look at using these built-in facilities before you build your own. It's hard to do that when you don't even know about them

Noise words with full text indexing on a cluster
SimonS' SQL Blog
I have just encountered a nice feature where full text indexing on a SQL 2000 cluster had different results depending on which node SQL was running on. Primarily, the noise words that the fulltext process used were different.

Seven Monitoring Scripts
Joseph Sack
Joe has a bunch of small, easy-to-use scripts that can definitely be the beginning of a great set of monitoring tools. Amazing how much info you can gather with a little bit of code!

SQL Server script to automate the removal of logins
Raymond Lewallen
Here is another one of those files that just floats around in a SqlScripts directory on my computer. I've found this script useful many times. It was written by Clinton Herring many moons ago. This script will remove a login from SQL Server. What's so fancy about the script? Well, the script takes care of going through each database and removing the login's permissions and object ownership (at least as much as possible), so you don't have to spend time searching and searching for this owner throughout all the databases on a server. You'll want to make note of the comments Clinton has left in the header, as there is important information on the behavior and actions taken when a scenario is encountered. One such scenario is: when a login you want to remove owns a database, ownership of that database is given to 'sa'.

ETL: Efficiency by Design
Harry Vassilev
This article presents a Data Warehouse architecture together with an optimized, non-invasive ETL strategy based on Change Data Capture technology. It ensures consistent data snapshots for any volume size, as well as fast data refresh for near-real-time Data Warehousing. The suggested approach can also be useful for Operational Data Store applications where data currency is critical.

SQL Server Input and Output: Using HTTP with SQL Server XML Templates
Buck Woody
Security is an important part of any computing application. There is a never-ending balancing act between allowing valid users into an account while preventing invalid users from accessing the data

Visual Studio 2005 Pricing and Licensing Update
David Boschmans' Weblog
Today Sanjay Parthasarathy, Corporate Vice President Developer & Platform Evangelism, made a clear statement that Microsoft is listening to its customers by announcing to the community that it will change its SKU strategy, pricing and licensing of Visual Studio 2005

Localization in .NET 2.0: It's Still About The Architecture
Paul Ballard
Michele Leroux Bustamante presents how to architect an application in .NET 2.0 to allow for localization of resources and text. The article shows how to make use of external resources and satellite assemblies, as well as the effect of cultures on strong naming and the localization implications for web-based applications.

What the IT Decision-Maker Needs to Know About SQL Server Performance Tuning
Kurt Survance
This article is written for someone with a working knowledge of information technology, but with little specific SQL Server expertise. The typical reader would be an IT manager or executive who needs background information to properly evaluate all the performance tuning suggestions being offered to him/her before spending significant amounts of money on performance tuning

More information on backups and re-indexing
Greg Robidoux
I've inherited the backup jobs from the previous Database Administrator and they are not "normal" jobs as far as I can tell. The transaction logs are backed up hourly, and the database itself is backed up completely every night after several tables' indexes are re-indexed.

SQL Server 2005 - SQL Server Integration Services - Part 1
Marcin Policht
One of the SQL Server 2005 Beta 2 features that has undergone considerable architectural, functional, and interface changes, compared with its earlier versions, deals with extraction, transformation, and loading (also known simply as ETL) of data - in other words, allowing you to move data between a variety of stores and to modify it in the process, according to your arbitrary requirements. While in SQL Server 7.0 and 2000 such capabilities were provided by Data Transformation Services (DTS for short), Microsoft decided that the degree of modifications introduced in the new version justified re-branding it; hence its rearchitected and significantly improved implementation has been introduced under the new name of SQL Server Integration Services (SSIS). For the same reason, rather than pointing out changes that have been implemented since the release of DTS in SQL Server 2000, in the next few articles we will provide a comprehensive analysis of its SQL Server 2005 equivalent. For a review of the SQL Server 2000 version of Data Transformation Services, refer to our earlier series covering this topic, published on the Database Journal Web site.

SQL Server 2000 SP4 problems with AWE
To SQL or not to SQL
Looks to be a bit of a wild one. SQL Server 2000 SP4 introduced a bug where only half the RAM can be accessed by SQL Server when AWE is enabled. So, if you have 16 GB, SQL Server will only use 8 GB. Microsoft KB article 899761 describes the problem. Its advice is to uninstall SP4 and call PSS. A hotfix for this is in progress. The SP4 download site now has a warning about not installing SP4 on systems with AWE enabled. http://www.microsoft.com/sql/downloads/2000/sp4.asp Now I wonder how SP4 got out the door with that problem!

Cool SQL Server 2005 Feature: Dynamic context help in Management Studio
The SQL Doctor is In
This is probably going to save me the most time of all the features of 2005. I am pretty good at T-SQL coding, but there are more settings on the SELECT statement than I can keep in my head at any given time. And remembering all of the parts of the ALTER DATABASE command to turn on SNAPSHOT ISOLATION (for a blog I am working on)? As if. So I just touch the word ALTER and about twenty things come up: ALTER SERVICE, ALTER PARTITION SCHEME, ALTER INDEX, etc. DATABASE is down there, and clicking on DATABASE gives me CREATE DATABASE, ALTER DATABASE, and DROP DATABASE, plus one other that is really long and I don't feel like typing out. Bam, I have my answer: it is ALLOW_SNAPSHOT_ISOLATION, not ALLOW_SNAPSHOT_TRANSACTION. I was never actually able to get F1 or Shift-F1 (or whatever the keystroke was) to work; in years past I usually had to type in what I wanted to search for.

Introduction to Many-to-Many Dimensions
sqlserveranalysisservices.com
Data structures do not always conform to the snowflake or star schema model where one fact is associated with a single dimension member. For example, in a typical cube analyzing sales data, a single sales transaction is associated with a single customer, a single product and a single point in time

CellSetGrid (beta version)
sqlserveranalysisservices.com
CellSetGrid is a simple ASP.NET control for browsing Analysis Services 2000 and 2005 cubes. If you're a developer, the control can be dropped into a Visual Studio ASP.NET project. Anyone using IE can open your page and browse the data - no client-side install. The control and source code are available. (Last updated 4.28.04)

Loading Aggregate Data in Yukon 2005
sqlserveranalysisservices.com
In some cases, the source data is pre-aggregated in one, some, or all dimensions. In the simplest example, employee salary is loaded for all employees. Because a manager's salary is not the aggregate of his reports' salaries, it should not be aggregated up the employee dimension. There are several other such examples. This document discusses how pre-aggregated data can be loaded directly into Analysis Services 2005.

Inventory Management Calculations in SQL Server Analysis Services 2005
sqlserveranalysisservices.com
A common business problem is how to calculate inventory based on incoming and outgoing counts. At any point in time, inventory is not simply summed from the deltas because inventory on a particular day is the sum of the deltas across time. Additionally, inventory is counted on a regular basis to account for errors in deltas, shrinkage, or other discrepancies
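In SQL terms, the calculation described here is a running sum over the deltas; a sketch against a hypothetical delta table (table and column names are invented for the example):

```sql
-- Inventory on hand for each day = sum of all deltas up to and including it
SELECT d1.delta_date,
       SUM(d2.qty_delta) AS inventory_on_hand
FROM inventory_deltas AS d1
JOIN inventory_deltas AS d2
  ON d2.delta_date <= d1.delta_date
GROUP BY d1.delta_date
ORDER BY d1.delta_date
```

Periodic physical counts then replace the computed value as a new baseline, which is the correction mechanism the article goes on to discuss.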

Cell Annotations
sqlserveranalysisservices.com
A cell annotation is a note on a cell written by a user explaining something of significance. It’s common in financial applications to allow users to annotate cells during, say, the budgeting process to explain some number. This document explains how cell annotations can be incorporated into an Analysis Services 2005 application

Handling Annotation and comments in Analysis Services
Thierry D’hers, Mosha Pasumansky
This write-up summarizes several techniques for handling text information within the cube. It describes the reasons for these techniques, how they are applied, and their pros and cons, as well as their limitations.

[Back to top]

SQL.RU FORUM

Most popular topics of the week

Your opinion on the SELECT exercises at http://sql.ipps.ru
New exercises at http://sql.ipps.ru
Ask an MS SQL developer a question!
More books, good and varied!
Permissions for the domain account under which SQL Server Agent starts
Please vote for SQL.RU in the Интернить 2005 contest
The debugger does not work after installing Windows XP Service Pack 2
DTS -> how do I export to Excel starting from row 15, and other questions
Migrating databases from MSDE 7 to SQL 2000
Resource allocation
A bug in [ORDER BY] that MSSQL turns a blind eye to
Accessing changed values in a trigger
A tricky constraint
Trigger
Suspect. Attaching database has failed.
Which is faster?
Selecting only duplicated columns
Facing a dilemma about what to buy: Xeon or Itanium. Please advise.
Converting a KLADR address to a numeric code!!!
SQL 6.5. Migrating a database without losing relationships.

[Back to top]

Questions left unanswered

OPENDATASOURCE aborts the procedure - can this be prevented?
What is transferred during replication?
How to change the code page and ignore everything
How to move the distribution database to another server??

[Back to top]

ANNOUNCEMENTS

A new contest at GotDotNet.ru!

Following tradition, GotDotNet.ru is holding a new contest in May. Last May it was a contest on knowledge of Visual Basic .NET, whose grand prize was a car. The new contest will not have a grand prize that big, but it will have a huge number of smaller prizes: boxed copies of Visual Studio .NET 2003 Professional Special Edition and Visual Basic .NET 2003 Standard Russian.
The upcoming contest will be unique, in that every participant will receive a prize, and immediately rather than at the end of the contest. The detailed rules will be published in the contest section of the site, which will appear shortly. To take part, you must submit an application and choose a nomination. The preliminary list of nominations consists of two categories:

  • Useful tools, libraries, and templates for developers (participation prize: Visual Studio .NET 2003 Professional Special Edition).
    This category includes any useful tools, libraries, components, and templates that can be used when developing applications on the Microsoft .NET platform.

  • Useful programs for the household (Visual Basic .NET 2003 Standard Russian).
    This category includes desktop (Windows) applications running on Microsoft .NET that can help automate the many problems and tasks that arise around the house and that a home computer can help solve.

A mandatory requirement for submitted applications is that they use the Microsoft .NET platform. An application may run locally or use network connections (p2p, or a dedicated server, which in that case must be built on ASP.NET Web Services).

The license for an application or library submitted to the contest may be free or paid. In the latter case, the application must come with a fully functional, time-limited demo copy.

To take part in the contest you will need to:

  1. Register on the GotDotNet.ru site.

  2. Accept the contest rules. The main condition of participation is a commitment to deliver the application or developer component within the specified period (by August 1, 2005 for applications and by September 1, 2005 for developer components).

  3. Fill out the participant questionnaire.

  4. Post an entry describing what will be submitted to the contest.

The contest is open to individual developers. The prize (a license for Visual Studio .NET 2003 Professional Special Edition and Visual Basic .NET 2003 Standard Russian) is given to the participant for personal use. No more than one proposal per category is accepted from each participant.

Once posted, an entry is put up for public discussion and rating. The authors of the proposals that score the most points by June 15, 2005 receive a prize for the entry. Participants who receive a prize but fail to deliver the application or developer component within the specified period will be placed on a special page of the site (a "wall of shame"), from which they will be removed only after delivering the result of their work.

Based on the contest results, the authors of the best entries will be additionally awarded prizes, the list of which will be determined later. The best entries will be put on a disc that will be distributed to MSDN subscribers as well as with magazines (the list will be determined separately).

[Back to top]


Questions, suggestions, comments, remarks, criticism, and so on can be left for Vitaly Stepanenko and Alexander Gladchenko in the forum thread: Newsletter discussion

http://subscribe.ru/
http://subscribe.ru/feedback/
Newsletter code: comp.soft.winsoft.sqlhelpyouself