Welcome to Oleg Kapustin’s sandbox!

This is my personal playground where I keep my toys, build my sand castles and share my musings about technology, IT, software development processes and anything else that excites me.

To learn more about me please visit the “About Me” page.

Please note that I do not plan to publish any technology-related posts in the foreseeable future. It is very likely that all such posts will be moved to the archive during 2017.

If you came here from a search engine, you are most probably interested in one of these things:

System Center Operations Manager

SQL Server & Reporting Services

If you do not see anything interesting, don’t worry – just browse this place and you will find something. Also, the category tree is always available on your left. Should you think that this blog needs an article about something cool, don’t hesitate to contact me.

With kind regards,
O.K.

 

SQL Server 2016 – installation process notes

It is a common thing in the tech world to start a product evaluation with installing the product and writing a blog post about the installation experience. We wait for new product releases, expect awesome new features and, sometimes, just “cannot wait”. I have been somewhat addicted to SQL Server for a long time (more than 15 years so far) and, of course, could not restrain myself from diving into the new features. So today I’m sharing my notes on the most noticeable things that grabbed my attention at the very start of my journey – the SQL Server 2016 installation process.

1. SQL Server 2016 Installation guidance

As usual, the installation process is very well documented at MSDN and TechNet (though the SQL Server section is no longer called “Books Online”), and I had no need to search for any blog posts to resolve issues. All error and warning messages have links which take you to the relevant troubleshooting articles. The only thing I can complain about is that it is not obvious how to copy a link from the error dialog (hint: use Ctrl+C).

2. Tools are NOT included

This can be noted at the very start of the process – the installation center contains two extra links: one for SQL Server Management Tools and another for SQL Server Data Tools.

SQL Server 2016 Installation 001

At the same time, the feature selection page does not list any management tools (neither complete nor basic) among the shared features. As for me, I do not regret their absence – there is absolutely no reason to copy unnecessary bits to every single server.

SQL Server 2016 Installation 002

3. Some features depend on third-party software

Yes, they do! Considering Microsoft’s aggressive offers to those who use Oracle, this message looks a little bit surreal:

Rule “Oracle JRE 7 Update 51 (64-bit) or higher is required for Polybase” failed.

This computer does not have the Oracle Java SE Runtime Environment Version 7 Update 51 (64-bit) or higher installed. The Oracle Java SE Runtime Environment is software provided by a third party. Microsoft grants you no rights for such third-party software. You are responsible for and must separately locate, read and accept applicable third-party license terms. To continue, download the Oracle SE Java Runtime Environment from http://go.microsoft.com/fwlink/?LinkId=526030
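
If you are not sure whether a suitable JRE is already present on the box, a quick check from PowerShell can save you a setup re-run. This is only a rough sketch – the registry path below is where Oracle’s installer normally registers the 64-bit runtime, so treat it as an assumption rather than a guarantee:

# Rough check for a 64-bit Oracle JRE registration (run from 64-bit PowerShell)
$jre = Get-ItemProperty -Path 'HKLM:\SOFTWARE\JavaSoft\Java Runtime Environment' -ErrorAction SilentlyContinue
if ($jre) { "Installed JRE version: $($jre.CurrentVersion)" } else { 'No 64-bit JRE registration found' }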

4. There are manual post-install steps

I got this warning because of the “Advanced Analytics Extensions” feature:

Rule “Post-installation steps are required for Advanced Analytics Extensions” generated a warning.

The feature Advanced Analytics Extensions requires some post-installation steps after completing SQL Server setup. Please follow the steps outlined in the link http://go.microsoft.com/fwlink/?LinkId=724391

The guide lists several manual steps, and I skipped them for now, so this part will be covered in a separate post later.

5. Tempdb can be configured during installation – finally!

Recommended tempdb configuration and performance optimization has been an overlooked area for many years. (Here are just some older articles on the subject: Optimizing tempdb Performance at TechNet, Brent Ozar’s TempDB Performance and Configuration.) Now the installation dialog draws your attention to the fact that even a brand-new box should have more than one tempdb data file:

SQL Server 2016 Installation tempdb configuration 003
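
By the way, the same settings can also be scripted for unattended installations. Here is a hedged fragment – the /SQLTEMPDB* parameter names are how I read the SQL Server 2016 setup documentation and the values are purely illustrative; this is not a complete command line, so check the official parameter reference before using it:

# Fragment of an unattended install with four equally sized tempdb data files (illustrative values)
.\setup.exe /ACTION=Install /FEATURES=SQLEngine /INSTANCENAME=MSSQLSERVER `
    /SQLTEMPDBFILECOUNT=4 /SQLTEMPDBFILESIZE=1024 /SQLTEMPDBFILEGROWTH=512 /SQLTEMPDBDIR="T:\TempDB"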

Last, but not least

My overall impression of the SQL Server 2016 installation process is quite positive: all warnings and error messages are quite informative, the process is well documented, and the user experience hasn’t changed in comparison with previous releases. Of course, some may complain about the old-school design of the installation dialogs, but for me it really doesn’t matter.

What’s next? I cannot wait to dive into the new and promising features. SQL Server 2016 Reporting Services is at the top of my list.

 

Interesting SQL Server Downloads – Week 13/2016

Microsoft has become faster and much more agile than before when it comes to interesting SQL Server downloads. Needless to say, many interesting tools and “side applications” follow the announcement of SQL Server 2016. Here are just a few I have spotted this week:

***

Microsoft® SQL Server® 2016 RC1 Report Builder

Click here to view original page at www.microsoft.com

Report Builder provides a productive report-authoring environment for IT professionals and power users.

***

Microsoft® SQL Server® 2016 RC1 PowerPivot® for Microsoft SharePoint® 2016

Click here to view original page at www.microsoft.com

Microsoft SQL Server 2016 PowerPivot for SharePoint 2016 extends SharePoint Server 2016 to add server-side data refresh processing, collaboration, and management support for PowerPivot workbooks.

***

Microsoft® SQL Server® 2016 RC1 Master Data Services Add-in For Microsoft® Excel®

Click here to view original page at www.microsoft.com

The Master Data Services Add-in for Excel gives multiple users the ability to update master data in a familiar tool without compromising the data’s integrity in Master Data Services.

***

I think I will keep an eye on these on a regular basis!

 

Hasn’t Big Data killed Data Warehousing Already?

Originally published in DataArt blog.

Information technology has always been full of surprisingly contradictory beliefs, and every market, product or community has its own FAQ list or Top 10 Myths whitepaper. This week brought another “myth case” to my desk. Though it has been around for several years already, it is still hot. While my fellow database developers are busy completing another data warehousing project (a “traditional” relational solution, by the way) for a travel firm, our marketing department approached me to discuss how we should define our new data warehousing offering. The question and concern was: “Hasn’t big data killed data warehousing already?”

The question seems tricky and invites a dive into architectural details, pros and cons, and which solution better supports data intake, business analytics or interactive visualisation. I have to confess I’m no saint, so I started with the categories that the mind of a database professional dictates – read and write efficiency, scalability, data consistency, data query technologies. The list kept growing, but it was not taking me any closer to the answer. I spent some time trying to sort out differentiators for each technology, but with no success. (Technology is the key word here, so remember it and continue reading.)

The reason why I failed to produce a good comparison is quite simple – my database pro’s brain assumed that the term “data warehouse” equals “relational data warehouse”. We know that relational data warehouses (or “traditional” data warehouses, as some marketing whitepapers say) are in fact relational databases which host structured data. But what if we remove “relational” from the equation? What does “data warehouse” mean then? Can we have a non-relational DW? Continue reading…

 

Data Warehousing: Then & Now, and What to Do with It

Originally published in DataArt blog.

Background

Data warehousing is not a new thing today. The concept was first introduced in the 1970s, and its key terms “dimension” and “fact” appeared even earlier – in the 1960s. Since then, many businesses have successfully implemented and adopted various data warehouse solutions. Though they were using a great variety of technologies, processes and ways of thinking, their goals were alike – consolidating data from scattered operational systems, making data clean and trustworthy, extracting the information, and unlocking hidden knowledge. All this was necessary to improve business decisions – to make them informed rather than based on blind guesses.

Many organizations from various industries – from finance to hospitality, from healthcare to gambling – leverage the benefits provided by this decades-old concept. But technology evolves and brings new methods of data processing, new algorithms and implementations, new features and new possibilities. The amount of data available for analysis grows dramatically. The speed of communication increases. Thus businesses face new challenges – they need to cope with a highly competitive environment which moves much faster than before, they need to evaluate their situation much more accurately, and they cannot afford to wait.

In recent years a new trend in data warehousing has emerged: many companies are looking for ways to improve their existing solutions, which currently are:

  • Hard to maintain. Some base technologies are outdated and will no longer be supported in a matter of months, some key people may have already left the company and (in the worst cases) some custom source code has been lost;
  • Slow. Well, maybe not slow, but not fast enough. Business users complain that they spend too much time waiting for that “key report required by regulations”;
  • Not functional enough. The business community cannot proceed with “this simple kind of analysis” because “the data warehouse is not designed for that”.

Though all these reasons sound valid and business-justified, in most cases there are many people who are afraid of any changes to their data warehouse and show significant resistance. There is no surprise here: the DW is considered the informational heart of many businesses, and (we think) most people are afraid of heart surgery. (Only heartless cyborgs, we believe, are not.)

It becomes very important for IT departments to show and prove that changing the corporate data warehouse will not be surgery but therapy; that it will be done in a qualified and controlled manner; and that all actions are planned and risks are mitigated.

Continue reading…

 

Survey 2014: How do you use SCOM reporting? – Results

I started this post with a very simple thing – I copied and pasted the title from my previous post. And I immediately got the very strange feeling of having a huge debt. Not a financial one, of course (which I try to avoid), but a blogging and writing debt. It is weird and ridiculous to publish survey results one year later! I had a choice of either moving the survey to the trash or confessing that my blogging debt is really huge 🙂

So, not only does this post present the results of the SCOM reporting survey, it should also be considered another official relaunch of my blogging activity.

Survey results

Do you use SCOM reporting features?

Yes, both built-in reporting and Azure Operational Insights 17%
Yes, only built-in reporting 77%
Yes, only Azure Operational Insights 0%
No 6%

Are you satisfied with your reporting experience?

Yes 6%
Almost 38%
Doubt 28%
Almost useless feature 8%
Noooooooooo 21%

Generic vs. Product-specific vs. Self-service?

Generic (i.e. reports that work for data collected by any MP) 28%
Product-specific (i.e. reports designed for specific MP) 26%
Self-service 45%

Which kind of report do you need most often?

Availability 70%
Most common alerts 55%
Performance/performance details 70%
Events 26%
Configuration 25%
Product specific 30%
Other 9%

Why do you need reports?

Troubleshooting 60%
Planning 62%
To make my manager happy 66%
Other 11%

How often do you use reports?

Every day 25%
Several times per week 30%
Once per week 26%
Once per month 15%
Once per quarter 4%

What are the most common reporting issues?

No data 36%
Incorrect data 13%
Incorrect aggregation 25%
Missing features 38%
Too slow 38%
Bad usability/user experience 53%
Execution errors 15%
Other 11%

Do you need a self-service reporting option for SCOM?

Yes, I want it on-premise 72%
Yes, I want in the cloud 4%
Yes, I use Azure Operational Insights already 4%
No, I don’t need that 17%
Other 4%

Your role in organization?

IT Pro (SCOM Admin) 66%
IT Pro (Not SCOM Admin) 6%
IT Manager 6%
IT Executive 6%
Consultant 15%
MP Developer 2%

Conclusion

As for me, the users’ perception of the SCOM reporting feature is very clearly described by the numbers and, unfortunately, it is very far from a “good enough” mark. However, the survey took place more than a year ago, so I hope that with the release of SCOM 2016 and SQL Server 2016 the situation will improve. I haven’t tried these products yet, so take this as my personal expectation, not a promise of any kind.

 

Extending Active Directory Schema to Store Application Configuration (+PowerShell Examples)

These days I work together with many top-notch developers, dealing with very different projects, solutions and applications (not only with SCOM management packs :)). And what I like most of all about being in this very diverse community is the variety of questions that folks bring to the table. Here is the most recent one:

Can we store configuration information for our application in AD? If yes, then how?

I had been dealing a lot with Exchange Server during the last few months, so for me the answer to the first question was obvious – yes, you can store configuration information in Active Directory. Finding the answer to the second question was trickier – I didn’t have enough knowledge about implementing this kind of thing. So I dove into the theory and ended up with a bunch of links and some samples written in PowerShell. Here it goes.

Disclaimer

All samples here are provided as is. Use them at your own risk.

Note that any changes you apply to the AD schema are not reversible – consider testing any changes in a lab first.

All samples are written in PowerShell, so both IT Pros and developers can use them. (I do believe that professional developers can read the code in any language and are able to easily convert PowerShell samples into C#.)

Some theory

Many modern applications have a multi-tier architecture. To be able to act as a whole, some application components might need to share configuration information with others. Exchange Server is a good example of such an application – a typical deployment includes a number of servers holding different roles (client access server, mailbox server, edge transport) distributed across the entire organization.

Active Directory suits that need very well – everyone within the organization can access the information when required and permitted, and the AD infrastructure is usually highly available.

There are a zillion ways to store the required information, and it is up to the software developers to decide how exactly they want to implement it. Here are some common points of consideration:

  • What information should be stored?
  • Who should be able to access the information?
  • Is that information sensitive?
  • Where do we need that information (anywhere in the forest, anywhere in the domain, selected Domain Controllers)?
  • Can we (do we want to) extend the schema?

It is impossible to discuss and illustrate all the possible options in one blog post, so let’s concentrate on just one example.
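
To give you a taste before the detailed walk-through, here is a minimal sketch of the “no schema change” route – storing settings on a serviceConnectionPoint object, the same mechanism Exchange uses for Autodiscover. The container name, attribute values and URL below are purely illustrative, and creating objects under CN=System requires appropriate permissions:

# Create a container and a service connection point holding application settings (illustrative names)
Import-Module ActiveDirectory
$configRoot = "CN=System,$((Get-ADDomain).DistinguishedName)"
New-ADObject -Name "MyAppConfig" -Type container -Path $configRoot
New-ADObject -Name "MyAppScp" -Type serviceConnectionPoint -Path "CN=MyAppConfig,$configRoot" -OtherAttributes @{
    serviceClassName          = 'MyApp'
    serviceBindingInformation = 'https://myapp.contoso.com/api'
    keywords                  = 'Environment=Prod', 'Version=1.0'
}
# Read the settings back from anywhere in the domain
Get-ADObject -SearchBase $configRoot -LDAPFilter '(serviceClassName=MyApp)' -Properties serviceBindingInformation, keywords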

Continue reading…

 

SCOM Managed Module – Hidden Logics? Absolutely Not!

Today I had some time to look through forum threads at System Center Central. One of the discussions that captured my attention was about a bug in the discovery of SSRS instances with an underscore (_) in the instance name.

I will not write anything about the bug itself; at the end of the day, it was an ugly bug. Bugs happen, and some sit there for years. Believe me or not, good developers and testers blame themselves and feel unhappy when they step into a situation like this, having failed to catch the problem in advance.

In this post I want to discuss another thing – a couple of sentences caught my eye. Here they are:

At least with the VB script, you could run it manually and see where it was failing.  The dll process is impossible to do anything with (as far as I know) and leaves us having to call MS every time we get one of those cryptic errors.

Indeed, many modern System Center Operations Manager management packs are implemented using so-called SCOM managed modules. I know that many SCOM admins like to unseal management packs and read the source code to understand exactly how they work and whether they suit their needs. Many people think that managed modules limit their ability to research the logic. In my opinion, that is absolutely incorrect, and here is a walk-through for reverse engineering MPs based on managed modules.

Theory

SCOM managed modules are in fact .NET classes, nothing more than that. Those .NET classes are stored in assemblies (.dll files) which are bundled together with the management pack definition into an .mpb file. Luckily, a) a .NET assembly carries tons of metadata inside, and b) the .NET Framework has a feature called reflection. There are tools which leverage these capabilities and can convert an assembly back into human-readable C# code. This approach is widely used by .NET developers for research and reverse engineering, which is an integral part of the profession.
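
If you want to see that metadata with your own eyes before reaching for a decompiler, plain PowerShell reflection is already enough. A tiny sketch (the assembly path below is a placeholder, and GetTypes() will complain if dependent assemblies are not sitting next to the file):

# Load a managed-module assembly and list the public types it contains
$asm = [System.Reflection.Assembly]::LoadFrom('C:\Unpacked\SomeManagementPackModules.dll')
$asm.GetTypes() | Where-Object { $_.IsPublic } | Select-Object FullName, BaseType | Sort-Object FullName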

Practice

1. Get the MSI for the management pack you want to explore.

2. Unpack the MSI:

msiexec /a <FullPathToMsi> /qb TARGETDIR="<FullPathToSomeDirectory>"

3. Unseal the .mpb (this can be done with various tools, MPViewer is one of them)

4. Locate the module you want to research (drill down from the discovery, through data sources, until you find the probe with “managed” implementation)

SSRS Discovery Probe Managed Implementation

5. Find the appropriate .dll

SSRS MP Assets

6. Get the reflection tool (for example – dotPeek from JetBrains – it is free)

7. Open the .dll with the reflection/decompile tool

8. Enjoy!

SSRS Discovery Probe Disassembled

Now you can browse the code, compare versions, find what was changed and how, and so on. Personally, I prefer not to rely on others, especially on the huge software shop located in Redmond. Yep, we pay money for the software, but other people pay us money – most probably for getting things done, not for waiting.

 

SCOM Subscription Issue – Notification Contains Indexed Placeholder Instead of Value in the AlertName

Yes, that’s a very long title, but it really describes the issue that was brought to the table last week. The original question was:

All of the notification emails for these alerts only contained {2} for the Subject and Alert Name.

That was something I hadn’t heard before, so I decided to do some quick research. As I had no access to an Exchange lab this time, I created a sample management pack which triggers alerts with some random data. There were two monitors: one with a “static” name for the alert message and another with a “dynamic” name. Here is the code for the dynamic one:

<ManagementPackFragment SchemaVersion="2.0" xmlns:xsd="http://www.w3.org/2001/XMLSchema">
  <Monitoring>
    <Monitors>
      <UnitMonitor ID="OKSB.SubscriptionTest.Singleton.UM.RandomDataDynamic"
                   Accessibility="Public"
                   Enabled="true"
                   Target="OKSB.SubscriptionTest.Singleton"
                   ParentMonitorID="Health!System.Health.AvailabilityState"
                   Remotable="true"
                   TypeID="OKSB.SubscriptionTest.UMT.RandomData">
        <Category>Custom</Category>
        <AlertSettings AlertMessage="OKSB.SubscriptionTest.Singleton.UM.RandomDataDynamic.AlertMessage">
          <AlertOnState>Error</AlertOnState>
          <AutoResolve>true</AutoResolve>
          <AlertPriority>Normal</AlertPriority>
          <AlertSeverity>Error</AlertSeverity>
          <AlertParameters>
            <AlertParameter1>$Data/Context/Property[@Name='dt']$</AlertParameter1>
            <AlertParameter2>$Data/Context/Property[@Name='Random']$</AlertParameter2>
            <AlertParameter3>$Data/Context/Property[@Name='Title']$</AlertParameter3>
          </AlertParameters>
        </AlertSettings>
        <OperationalStates>
          <OperationalState ID="ErrorHealthState" MonitorTypeStateID="Unhealthy" HealthState="Error" />
          <OperationalState ID="SuccessHealthState" MonitorTypeStateID="Healthy" HealthState="Success" />
        </OperationalStates>
        <Configuration>
          <IntervalSeconds>180</IntervalSeconds>
          <TimeoutSeconds>600</TimeoutSeconds>
          <SyncTime/>
        </Configuration>

      </UnitMonitor>
    </Monitors>
  </Monitoring>
  <Presentation>
    <StringResources>
      <StringResource ID="OKSB.SubscriptionTest.Singleton.UM.RandomDataDynamic.AlertMessage"/>
    </StringResources>
  </Presentation>
  <LanguagePacks>
    <LanguagePack ID="ENU" IsDefault="true">
      <DisplayStrings>
        <DisplayString ElementID="OKSB.SubscriptionTest.Singleton.UM.RandomDataDynamic">
          <Name>
            OKSB Subscription Test: Random data unit monitor - dynamic title
          </Name>
        </DisplayString>
        <DisplayString ElementID="OKSB.SubscriptionTest.Singleton.UM.RandomDataDynamic.AlertMessage">
          <Name>{2} - dynamic title</Name>
          <Description>
            dt: {0}
            random: {1}
            title: {2}
          </Description>
        </DisplayString>
      </DisplayStrings>
    </LanguagePack>
  </LanguagePacks>
</ManagementPackFragment>

After that I created a command channel and a subscription, and passed this variable to the command:

"$Data[Default='Not Present']/Context/DataItem/AlertName$"

The command allowed me to dump the values of the AlertName variable to a file, and here is what I got:
Continue reading…

 

SCOM Management Pack for Exchange 2013 – Top 6 Things I Like

It was about a week ago that Microsoft announced an update to the SCOM Management Pack for Exchange 2013. This management pack made quite a buzz; frankly speaking, I haven’t seen anything like that since TechEd NA 2014. If you’re curious, take a look at some of the blog posts (link, link, link, link, link, link, link). Impressive, isn’t it?

So, now that we’re done with the high-level overview, let’s dive into the details and take a look at what’s under the hood.

Some basics

Download link.

Please note that this MP is NOT available from the catalog; go to the download page to get it.

The management pack includes 3 files:

Microsoft.Exchange.15.mp – contains new health model definition, as well as new rules, monitors, updated discovery workflow, folders and views. All these things are based on PowerShell scripts, which heavily use Exchange cmdlets.

Note that many rules share data sources, so if you want to play with interval parameters, try to keep the same value for all related rules. Otherwise you may accidentally break the cook-down and get an extra monitoring footprint on your Exchange servers. Considering that some E15 cmdlets are somewhat resource-intensive, this may add an extra headache for your fellow Exchange admins. A quick way to eyeball the configured intervals is sketched after the file descriptions below.

Microsoft.Exchange.15.Reports.mpb – contains report definitions. Unlike many other management packs, this MP doesn’t rely on the generic reports library (except for the *health reports) and has its own definitions for performance and top/bottom reports. There is also one very custom report, based on a custom dataset – “Top biggest mailboxes”.

Microsoft.Exchange.15.Visualization.Components.mpb – contains definitions and resources for dashboards, widgets and related components. Please note that this file currently has a dependency on some SCOM 2012 R2 UR2 library MPs, so if you haven’t upgraded yet, you’ll miss the dashboards with “almost a sexy look & feel” (© Marnix Wolf).
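
As promised above, here is a rough way to see which interval each rule’s data source is configured with. This sketch assumes the OperationsManager PowerShell module from SCOM 2012 R2 and the Microsoft.Exchange.15 MP ID; the module configuration is plain XML text, so a simple regex is enough for a quick look, and property names may differ slightly between SDK versions:

# List rules from the Exchange MP together with their data source IntervalSeconds (rough sketch)
Import-Module OperationsManager
$mp = Get-SCOMManagementPack -Name 'Microsoft.Exchange.15'
Get-SCOMRule -ManagementPack $mp | ForEach-Object {
    foreach ($ds in $_.DataSourceCollection) {
        if ($ds.Configuration -match '<IntervalSeconds>(\d+)</IntervalSeconds>') {
            [pscustomobject]@{ Rule = $_.Name; IntervalSeconds = [int]$Matches[1] }
        }
    }
} | Group-Object IntervalSeconds

Rules that share a data source should end up in the same interval bucket after any overrides you apply.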

Top 6 things I like in this MP

Initially I wanted to write about the “top 5” things but failed to decide which one to exclude. Not all points are equally important for everyone, but, from my perspective, everything mentioned below is at least “interesting”.

Continue reading…