Author Archive: Michael

Interview Question #7: Object Manager

Question

As a MicroStrategy developer, you are constantly using Object Manager to migrate objects that you develop inside the Subject Areas folder in your project. To save time, what object can you create to enable you to go straight to this folder when you open Object Manager?

A. Layout

B. Project Source

C. Script

D. Template

E. You cannot accomplish this in the Object Manager.

Answer

A. Layout

Opening multiple project sources at once in Object Manager

You may need to migrate objects between the same projects on multiple occasions. For example, you may need to move objects from your development environment to your test environment on a regular basis. Object Manager allows you to save the projects and project sources that you are logged in to as a layout. Later, instead of opening each project and project source individually, you can open the layout and automatically re-open those projects and project sources.

The default file extension for Object Manager Layout files is .omw.

To open an existing Object Manager layout

  1. From the File menu, select Open Layout. The Select Layout dialog box opens.

  2. Select a layout and click Open. A Login dialog box opens for each project source specified in the layout.

  3. For each Login dialog box, type a MicroStrategy login ID and password. The login must have the Use Object Manager privilege.

  4. Click OK. The project sources open, and you are automatically logged in to the projects specified in the layout as the user whose credentials you provided for each project source.

To save a workspace as an Object Manager layout

  1. Log in to any project sources and projects that you want to save in the layout.

  2. From the File menu, select Save Layout. The Save Layout dialog box opens.

  3. Specify a location and name for the layout file and click Save. The layout is saved.

MicroStrategy Course Where You Will Learn About This Topic

MicroStrategy Administration: Application Management Course

WIRED: A Redesigned Parking Sign So Simple That You’ll Never Get Towed


Your car gets towed, and who do you blame? Yourself? God no, you blame that impossibly confusing parking sign. It’s a fair accusation, really. Of all the questionable communication tools our cities use, parking signs are easily among the worst offenders. There are arrows pointing every which way, ambiguous meter instructions, and permit requirements. A sign will tell you that you can park until 8 am; right below it, another warns that you’ll be towed. It’s easy to imagine that beyond basic tests for legibility, most of these signs have never been vetted by actual drivers.

Like most urban drivers, Nikki Sylianteng was sick of getting tickets. During her time in Los Angeles, the now Brooklyn-based designer paid the city far more than she would’ve liked to. So she began thinking about how she might be able to solve this problem through design. She realized that with just a little more focus on usability, parking signs could actually be useful. “I’m not setting out to change the entire system,” she says. “It’s just something that I thought would help frustrated drivers.” [1]

Sylianteng notes: [2]

I’ve gotten one-too-many parking tickets because I’ve misinterpreted street parking signs. The current design also poses a driving hazard as it requires drivers to slow down while trying to follow the logic of what the sign is really saying. It shouldn’t have to be this complicated.

The only questions on everyone’s minds are:
1. “Can I park here now?”
2. “Until what time?”

My strategy was to visualize the blocks of time when parking is allowed and not allowed. I kept everything else the same – the colors and the form factor – as my intention with this redesign is to show how big a difference a thoughtful, though conservative and low budget, approach can make in terms of time and stress saved for the driver. I tried to stay mindful of the constraints that a large organization like the Department of Transportation must face for a seemingly small change such as this.


The sign has undergone multiple iterations, but the most recent features a parking schedule that shows a whole 24 hours for every day of the week. The times you can park are marked by blocks of green; the times you can’t are blocked out in candy-striped red and white. It’s totally stripped down, almost to the point of being confusing itself. But Sylianteng says there’s really no need for the extraneous detailed information we’ve become accustomed to. “Parking signs are trying to communicate very accurately what the rules actually are,” she says. “I’ve never looked at a sign and felt like there was any value in knowing why I couldn’t park. These designs don’t say why, but the ‘what’ is very clear.”

Sylianteng’s design still has a way to go. First, there’s the issue of color blindness, a factor she’s keenly aware of. The red and green are part of the legacy design from current signs, but she says it’s likely she’d ultimately change the colors to something more universal, like blue. Then there’s the fact that urban parking is a far more complex affair than most of us care to know. There’s an entire manual on parking regulations, and Sylianteng’s design does gloss over rules concerning different types of vehicles and space parameters indicating where people can park. She’s working on ways to incorporate all of that without reverting to the information overload she was trying to avoid in the first place. [1]


Sylianteng also posted on her blog an illustration of the problem in terms of biocost, as part of her Cybernetics class with Paul Pangaro. [2]

[Illustration: the biocost of the current parking sign design]

Sylianteng has been going around Manhattan and Brooklyn hanging up rogue revamped parking signs. “A friend of mine called it functional graffiti,” she says. She’ll stick a laminated version right below the city-approved version and ask drivers to leave comments. In that way, Sylianteng’s design is still a ways away from being a reality, but so far, she’s gotten pretty good feedback. “One person wrote: ‘This is awesome. The mayor should hire you.’” [1]

————————————————————————

Sources:

[1] Liz Stinson, A Redesigned Parking Sign So Simple That You’ll Never Get Towed, Wired, July 15, 2014, http://www.wired.com/2014/07/a-redesigned-parking-sign-so-simple-youll-never-get-towed-again.

[2] Nikki Sylianteng, blog, http://nikkisylianteng.com/project/parking-sign-redesign/.

Michael Saylor Keynote – MicroStrategy World Barcelona – July 2014

Click on image to watch keynote video


MicroStrategy Simplifies Product Packaging to Enhance Total Customer Experience

New MicroStrategy Pricing

Converges on Four Products; Offers Free Upgrades to Premium Capabilities for Existing Clients

BARCELONA, Spain, July 8, 2014 – MicroStrategy® Incorporated (Nasdaq: MSTR), a leading worldwide provider of enterprise software platforms, today announced a new packaging structure aimed at delivering the best end-to-end customer and partner experience, making it easier than ever to acquire, deploy, and succeed with MicroStrategy. MicroStrategy also announced that it is extending free upgrades for existing clients to the premium capabilities included in the new product packaging, offering greater value to clients and new users.

The packaging changes will empower new and existing MicroStrategy clients to realize the full potential of their analytical applications using the most comprehensive analytics and mobile platforms in the industry. This information, and more, can be found at: www.microstrategy.com/experience.

“Our new packaging makes it simple for organizations to choose MicroStrategy for the totality of their business analytics and mobile application needs,” said Paul Zolfaghari, President, MicroStrategy Incorporated. “We believe it instantly enhances our value to existing customers and is emblematic of our heightened focus on delivering a positive customer experience. The new packaging allows better preparation and planning for new deployments, providing more value over the broad range of solutions we offer.”

Under the new packaging structure, MicroStrategy’s full feature set, previously split into 21 discrete offerings, has been reduced to four simple packages that empower developers, analysts, power users, and consumers to take advantage of the comprehensive MicroStrategy platform with simplified value-based pricing.

“From a customer perspective this is a welcome change,” said Andrew Young, BI Director, at Bob Evans Farms, a MicroStrategy client. “Budgeting and planning new applications will be far easier, especially breaking down platform investments to our business customers. With the simplified offering and pricing structure we can paint a more complete picture and focus on the business value.”

MicroStrategy added that the new packaging allows clients to more affordably deploy the full breadth of MicroStrategy capabilities (including data federation, write-back, closed-loop analysis, and automated report distribution, among others) to more users across the enterprise, giving end users full authoring capabilities as needed and integration with Microsoft® Office® applications. System architects gain efficiencies with the full ability to manage upgrades, migrations, and data loads, as well as free server administration and monitoring features. Within the four product offerings, clients will have all styles of analytics (self-service, dashboards, advanced analytics) across any interface (web, mobile, PDF, email report distribution) at Big Data scale, on an automated platform.

“To maximize the value of a customer relationship requires that companies simplify the pricing to ensure the purchasing process of technology is easy and transparent,” said Mark Smith, CEO and Chief Research Officer at Ventana Research. “The new MicroStrategy packaging and pricing enable the best possible customer experience, while shortening the time to gain full value from technology for organizations.”

The new packaging focuses on user roles within an enterprise:

  • MicroStrategy Server™ benefits all user roles. It includes a fully featured server infrastructure that connects to multiple data sources, supports all major analytic styles from report distribution to information-driven apps to self-service data discovery, and scales to hundreds of thousands of users. It also includes the administration and monitoring tools that organizations need to manage their deployments effectively and efficiently.
  • MicroStrategy Web™ empowers business users to consume, author, and design analytics through an intuitive web-based interface. Business analysts can use MicroStrategy Web to take advantage of the all-inclusive set of self-service analytic capabilities.
  • MicroStrategy Mobile™, the award-winning, industry-leading interface for Apple iOS and Android devices, is an easy, fast, affordable way to mobilize analytics and information-driven applications to an increasingly mobile and 24 x 7 workforce.
  • MicroStrategy Architect™ provides developers with an extensive set of development, deployment and migration tools needed to efficiently manage the application development lifecycle.

The Company also noted that these four offerings complement the free MicroStrategy Analytics Desktop™ it made available last year. MicroStrategy Analytics Desktop is a free self-service business analytics tool designed to enable any individual user to gain deep insight into their data by effortlessly creating powerful, insightful visualizations and dashboards.

MicroStrategy Report Optimization: Computational Distance


Source: MicroStrategy University, Deploying MicroStrategy High Performance BI, V9.3.1, MicroStrategy, Inc., September 2013.

Computational Distance

Any BI system consists of a series of processes and tools that take raw data at the very bottom, at the transaction level in a database, and, using various technologies, transform that data into the finished answer that the user needs. At every step along the way, some kind of processing is done in one of the following components: the database, the network, the BI application, or the browser.

The concept of “computational distance” refers to the cumulative chain of systems, transformations, and other processes that the data must pass through, from its lowest level all the way to being rendered in a browser, as shown in the image above.

The longer the computational distance is for a given report, the longer it will take to execute and render. The preceding image shows a hypothetical example of a report that runs in 40 seconds. Each processing step on that report, such as aggregation, formatting, and rendering, adds to the report’s computational distance, increasing its overall execution time.

Reducing the Computational Distance of a Report

Computational distance offers a useful framework from a performance optimization perspective because it tells us that to improve the performance of a report or dashboard, we must focus on reducing its overall computational distance. The following image shows different techniques, such as caching, cubes, and aggregation, that can be used to optimize performance for the 40-second hypothetical report.

In the next blog post, we will look at two key computational distance reduction techniques offered in the MicroStrategy platform: caching and Intelligent Cubes.

Reducing the Computational Distance

Steve Heller, Alberto Cairo, and The World in Terms of General Motors


Readers:

The other day on Twitter, Alberto Cairo tweeted about a great visual map from a 1938 issue of Fortune Magazine that he found at Steve Heller’s Moving Sale on Saturday, June 28th, 2014, in New York City.



Steve Heller

Steven Heller wears many hats (in addition to the New York Yankees cap): for 33 years he was an art director at the New York Times, originally on the OpEd Page and, for almost 30 of those years, with the New York Times Book Review. Currently, he is co-chair of the MFA Designer as Author Department, Special Consultant to the President of SVA for New Programs, and writes the Visuals column for the New York Times Book Review.

He is the co-founder and co-chair (with Lita Talarico) of the MFA Designer as Author program at the School of Visual Arts, New York, where he lectures on the history of graphic design. Prior to this, he lectured for 14 years on the history of illustration in the MFA Illustration as Visual Essay program at the School of Visual Arts. He was also director for ten years of SVA’s Modernism & Eclecticism: A History of American Graphic Design symposiums.

The World in Terms of General Motors

The visual, from the December 1938 issue of Fortune Magazine, was called The World in Terms of General Motors. It depicted a sketch map showing the locations of GM’s (then) 110 plants. The spheres representing the plants are proportional (in volume) to their normal number of workers, and the key numbers on the spheres are indexed on the map. The map does not include manufacturing plants in which GM holds less than 50% of the stock; the principal ones are Ethyl Gasoline Corp., Bendix Aviation Corp., Kinetic Chemicals, Inc., and North American Aviation, Inc.

Not shown are GM’s many non-manufacturing interests, domestic warehouses, etc.

So, finally, here is the complete map.

Enjoy!

Michael

[Map: The World in Terms of General Motors, Fortune Magazine, December 1938]

MicroStrategy Report Optimization: Components of Performance

Readers:

Today, I am adding the second post in my MicroStrategy Report Optimization series. This will be a multi-part series (I will leave it open-ended so I can continue to add to it).

Today, we will look at the components that comprise performance.

Best Regards,

Michael

Source: MicroStrategy University, Deploying MicroStrategy High Performance BI, V9.3.1, MicroStrategy, Inc., September 2013.

Components of Performance

There are five key layers or components that a typical BI query must go through. They are:

  • Caching Options
  • Data Transfer
  • System Architecture and Configuration
  • Client Rendering or Data Presentation
  • Data Warehouse Access

The components above are not listed in any specific order of access during the execution of a query. The image below illustrates the five components.

The Components of High Performance

Caching and Intelligent Cubes

MicroStrategy’s memory technology is engineered to meet the increased demand for higher BI performance, which is driven by the rapid expansion of both data volumes and the number of BI users in organizations across industries. MicroStrategy accelerates performance by pre-calculating computations and placing the results into its memory acceleration engine to dramatically improve real-time query performance.

Data Transfer

Data transfer over one or more networks is a very important component of a BI implementation. A slow or poorly tuned network at any of those transfer points will translate into poor performance from a report or dashboard execution perspective.

System Architecture and Configuration

Successful BI applications accelerate user adoption and enhance productivity, resulting in demand for more users, data, and reports. MicroStrategy provides the ability to adapt quickly to constant change and evolve along with business requirements. MicroStrategy Intelligence Server has been proven in real-world scenarios to deliver the highest performance at scale with the fewest servers and minimum IT overhead.

Data Presentation

Dashboards provide graphical, executive views into KPIs, enabling quick business insights. MicroStrategy enables higher-performing dashboards, averaging 30-45% faster execution and interactivity. Using new compression methods, MicroStrategy dashboards have a smaller footprint than ever before (up to 55% smaller), resulting in faster delivery using less network bandwidth. Dashboards deliver ever more analysis and data for end users.

Data Warehouse Access

High performance BI starts with optimizing SQL queries to retrieve results from the database as quickly as possible. BI performance depends largely on the time the queries take to execute in the database. An average reporting request takes about 40 seconds to complete, of which 34 seconds, or 85% of the query time, is spent executing in the database.

Therefore, it is critical to optimize report queries to reduce database execution time.

12 JavaScript Libraries for Data Visualization

Readers:

This is from a blog post by Thomas Greco.

Thomas is a web developer / graphic designer living in New York City. When Thomas isn’t striving towards front-end perfection, he enjoys hanging with friends, going to concerts, and exploring the wilderness!

Thomas has provided twelve JavaScript frameworks that are extremely useful for data visualization. Thomas feels that an increasingly heavy focus is being placed on JavaScript as a data visualization tool.

I tried the demos for these JavaScript frameworks, and they are very impressive. I hope you enjoy this information as much as I did.

Best regards,

Michael

Dygraphs.js

The Dygraphs.js library allows developers to create interactive charts that use the X and Y axes to display powerful diagrams, and it holds up as the amount of data being parsed grows. Dygraphs was built so these visualizations can contain a multitude of views; for example, Dygraphs.js makes it possible to analyze separate portions of a data set, such as specific months, in addition to the timeframe in its entirety. The Dygraphs.js library is also compatible across all major web browsers and responds to touch input, making it a thoroughly solid choice as a data visualization framework.
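
As a minimal sketch of that usage (assuming the Dygraphs script is loaded and the page contains an empty div with id "graph"; the element id and data below are made up):

  new Dygraph(
      document.getElementById('graph'),  // target element (hypothetical id)
      'Date,Temperature\n' +             // data supplied as inline CSV
      '2014-07-01,22\n' +
      '2014-07-02,25\n' +
      '2014-07-03,19\n',
      { title: 'Daily Temperature' }     // optional settings
  );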

D3.js

Eventually becoming the successor to Protovis.js, D3 is capable of creating stunning graphics by dynamically updating the DOM. An acronym for Data-Driven Documents, D3.js makes use of chained methods when scripting visualizations, subsequently creating dynamic code that is also reusable. Due to its reliance on the DOM, D3 has been created in accordance with W3C web standards so that the library renders correctly across web browsers. Lastly, D3’s path generator function, defined as d3.svg.line(), gives developers the capability to produce a handful of SVGs by defining different paths and their properties.
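
As a rough sketch of that chained style, here is a tiny line chart built with the D3 v3 API (the version that defines d3.svg.line(); the data values are invented):

  var data = [{x: 0, y: 5}, {x: 50, y: 20}, {x: 100, y: 10}];

  // Build a path generator that maps data fields to pixel coordinates.
  var line = d3.svg.line()
      .x(function(d) { return d.x; })
      .y(function(d) { return 100 - d.y; });

  // Append an SVG to the page and draw the path.
  d3.select('body').append('svg')
      .attr('width', 120)
      .attr('height', 100)
    .append('path')
      .attr('d', line(data))
      .attr('stroke', 'steelblue')
      .attr('fill', 'none');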

InfoVis

Commonly referred to as InfoVis, the JavaScript InfoVis Toolkit (JIT) has also earned its stripes as a JavaScript library for data visualization. Equipped with WebGL support, InfoVis has been trusted by names like Mozilla and Al Jazeera, demonstrating its reliability as a visualization tool. Like the D3 framework, InfoVis makes use of chained methods to manipulate the DOM, making it a reliable library for developers of any skill set.
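
A minimal sketch of how a JIT visualization is typically wired up (assuming the toolkit is loaded and the page has a container div with id "infovis"; the node data is invented):

  // Create a force-directed visualization bound to the container.
  var fd = new $jit.ForceDirected({
      injectInto: 'infovis'
  });

  // Load a tiny graph and render it.
  fd.loadJSON([
      { id: 'a', name: 'Node A', adjacencies: ['b'] },
      { id: 'b', name: 'Node B', adjacencies: [] }
  ]);
  fd.refresh();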

The Google Visualization API

Hailing from the Google Developers Console (GDC), Google’s Visualization API can be called with barely any code. In addition to easy DOM modification, this Google API makes it simple for users to define custom modifier functions that can then be placed into custom groups. Furthermore, this interface’s usability, matched with the support of the GDC’s open-source network, places it near the top of the list of data visualization tools.
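
For example, a basic pie chart takes only a few lines with the classic jsapi loader (this sketch assumes the https://www.google.com/jsapi script is included on the page and a div with id "chart_div" exists; the data is made up):

  google.load('visualization', '1', { packages: ['corechart'] });
  google.setOnLoadCallback(drawChart);

  function drawChart() {
      // Build a DataTable from a plain array.
      var data = google.visualization.arrayToDataTable([
          ['Task', 'Hours'],
          ['Work', 8],
          ['Sleep', 8],
          ['Everything else', 8]
      ]);
      // Render into the chart_div element.
      var chart = new google.visualization.PieChart(
          document.getElementById('chart_div'));
      chart.draw(data, { title: 'My Day' });
  }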

Springy.js

Springy.js is a JavaScript library that relies on an algorithm to create force-directed graphs, resulting in nodes reacting in a spring-like manner on the web page. Although Springy.js comes configured with a predefined algorithm, options such as spring stiffness and damping can easily be passed as parameters. Springy.js was developed by Dennis Hotson as a library for developers to build off of – a fact that he makes clear.
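
A minimal sketch of that pattern (assuming Springy and its jQuery plugin are loaded and the page has a canvas element with id "springydemo"; the labels are invented):

  // Define a graph with a couple of nodes and an edge between them.
  var graph = new Springy.Graph();
  var spruce = graph.newNode({ label: 'Norway Spruce' });
  var fir = graph.newNode({ label: 'Sicilian Fir' });
  graph.newEdge(spruce, fir);

  // Hand the graph to the jQuery plugin, which animates the layout.
  $('#springydemo').springy({ graph: graph });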

Polymaps.js

Polymaps.js makes use of SVGs to generate interactive web maps with cross-browser compatibility in mind. At the heart of Polymaps lie vector tiles, which help ensure both optimal load speeds and optimal zoom functionality. Although it comes configured with components, Polymaps.js is easily customized and is able to read data in the form of vector geometry, GeoJSON files, and more. One showcase example is a map of the U.S. built from U.S. Census Bureau data.
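
A bare-bones sketch in the Polymaps style (the container id and tile URL below are placeholders, not a real tile server):

  var po = org.polymaps;

  // Create a map inside an SVG element appended to the container div.
  var map = po.map()
      .container(document.getElementById('map').appendChild(po.svg('svg')))
      .add(po.interact());  // pan and zoom handlers

  // Add an image layer of tiles; {Z}/{X}/{Y} are filled in per tile.
  map.add(po.image()
      .url(po.url('http://example.com/tiles/{Z}/{X}/{Y}.png')));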

Dimple

This past January, the Dimple API was developed so that analysts at Align-Alytics could build strong data visualizations without having to possess much development knowledge. That being said, Dimple makes it easy for anyone, analyst or not, to develop stunning graphics without any real JavaScript training. Moreover, dimplejs.org displays several demonstrations, which can easily be pointed at one’s own data to render a graph with the same configuration but different values. So, if you, or anyone you know, is trying to segue into the depths of JavaScript, these examples are perfect for beginners to visit and poke around.
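
A minimal sketch of a dimple bar chart (assuming d3 and dimple are loaded and the page has a div with id "chartContainer"; the data is invented):

  var svg = dimple.newSvg('#chartContainer', 600, 400);

  var data = [
      { Month: 'Jan', Sales: 4 },
      { Month: 'Feb', Sales: 6 },
      { Month: 'Mar', Sales: 5 }
  ];

  // Bind the data, declare the axes, and draw a bar series.
  var chart = new dimple.chart(svg, data);
  chart.addCategoryAxis('x', 'Month');
  chart.addMeasureAxis('y', 'Sales');
  chart.addSeries(null, dimple.plot.bar);
  chart.draw();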

Sigma.js

For people looking to build highly advanced network graphs, Sigma.js provides an unbelievable number of interactive settings inside its library and within its plug-ins. Its motto, “Dedicated to Graph Drawing,” says it plainly: those developing with Sigma.js cannot help but feel they have chosen a reliable library to work with. Moreover, Sigma’s developers encourage people to reconfigure the library and create plug-ins, which has resulted in a large open-source network. Having said all that, I was extremely pleased with various aspects of Sigma, and it is among my favorite libraries for creating graphical representations in JavaScript.
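
A small sketch of Sigma’s basic setup (assuming sigma.js v1 is loaded and a div with id "container" exists; the nodes and edges are invented):

  var s = new sigma({
      container: 'container',  // id of the target div
      graph: {
          nodes: [
              { id: 'n0', label: 'A', x: 0, y: 0, size: 1 },
              { id: 'n1', label: 'B', x: 1, y: 1, size: 1 }
          ],
          edges: [
              { id: 'e0', source: 'n0', target: 'n1' }
          ]
      }
  });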

Raphael.js

The Raphael.js library was created with an emphasis on browser compatibility. The framework follows the SVG W3C Recommendation, a set of standards that ensures images are completely scalable and free of pixelation. In addition to its use of SVGs, Raphael.js even falls back to the Vector Markup Language (VML) when rendering in Internet Explorer browsers prior to IE9. Although VML is very rarely used today, the support for it shows the attention to detail that the Raphael.js team put into developing the library.
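
The core API is tiny; a sketch (assuming raphael.js is loaded and a div with id "holder" exists):

  // Create a 320x200 drawing surface inside the holder div.
  var paper = Raphael('holder', 320, 200);

  // Draw a circle and style it.
  var circle = paper.circle(160, 100, 40);
  circle.attr({ fill: '#f00', stroke: '#333' });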

gRaphaël

Although Raphael.js is a library used for the creation of SVGs, it was not built with a total focus on representing large datasets. In turn, the gRaphaël JavaScript library was created. Weighing in at a mere 10KB, gRaphaël.js has proven to be a worthy extension to Raphael.js. Although it was not built around something like a force-directed algorithm, nor does it come pre-configured with any physics properties, gRaphaël is still a well-respected library for reasons ranging from its cross-compatible SVG structure to its ease of use. As long as it suits the task at hand, I believe gRaphaël.js should always be looked at as a viable resource for completing a project.
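
Because gRaphaël extends Raphael papers with chart methods, a pie chart is nearly a one-liner (a sketch assuming raphael.js, g.raphael.js, and g.pie.js are all loaded and a div with id "holder" exists; the values are invented):

  var r = Raphael('holder');
  // Center at (320, 240), radius 100, one slice per value.
  r.piechart(320, 240, 100, [55, 20, 13, 32, 5]);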

Leaflet

Whether you are developing for a smartphone, tablet, or desktop, the Leaflet JavaScript library has ranked atop the list of interactive mapping libraries for several reasons. Led by the founder of MapBox, Vladimir Agafonkin, Leaflet’s team of developers worked to create a library “designed with simplicity, performance, and usability in mind.” Like Polymaps, Leaflet can render SVG patterns via vector tiles; however, only Leaflet has been developed to support Retina displays. Furthermore, Leaflet can interpret various forms of data, such as GeoJSON, making it perfect for a number of tasks.
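
Leaflet’s quick-start pattern is only a few lines (a sketch assuming leaflet.js and its CSS are loaded and the page has a div with id "map"; the coordinates are arbitrary and the tile URL is OpenStreetMap’s public one):

  // Center the map on a latitude/longitude pair and a zoom level.
  var map = L.map('map').setView([40.7, -74.0], 12);

  // Add a tile layer; {s}/{z}/{x}/{y} are filled in per tile request.
  L.tileLayer('http://{s}.tile.openstreetmap.org/{z}/{x}/{y}.png', {
      attribution: '© OpenStreetMap contributors'
  }).addTo(map);

  // Drop a marker with a popup.
  L.marker([40.7, -74.0]).addTo(map).bindPopup('Hello from Leaflet');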

Ember Charts

For those who already use the juggernaut that is Ember.js, the developers at Addepar Open Source have created a few add-on libraries to extend the Ember experience: Ember Table, Ember Widgets, and Ember Charts. A child of Ember.js and D3.js, Ember Charts embraces the properties of flat design. Although limited, the library does have a handful of options for properties such as color and size, making it fairly simple to create impressive visualizations. Nonetheless, Ember’s presence on the front end could really help Ember Charts’ popularity in the future.

Stephen Few: Why Do We Visualize Quantitative Data?

Readers:

It has been a while since I have discussed some of the latest creative thoughts on data visualization from Stephen Few. I have read all of Steve’s books, attended several classes from him, and religiously follow his blog and newsletter on his website, Perceptual Edge.

For those of you who don’t know, Stephen Few is the Founder & Principal of Perceptual Edge. Perceptual Edge, founded in 2003, is a consultancy that was established to help organizations learn to design simple information displays for effective analysis and communication.

Steve has stated that his company will probably always be a company of one or two people, which is the perfect size for him. With 25 years of experience as an innovator, consultant, and educator in the fields of business intelligence and information design, he is now considered the leading expert in data visualization for data sense-making and communication.

Steve writes a quarterly Visual Business Intelligence Newsletter, speaks and teaches internationally, and provides design consulting. In 2004, he wrote the first comprehensive and practical guide to business graphics entitled Show Me the Numbers, now in its second edition. In 2006, he wrote the first and only guide to the visual design of dashboards, entitled Information Dashboard Design, also now in its second edition. In 2009, he wrote the first introduction for non-statisticians to visual data analysis, entitled Now You See It.

Here are his latest thoughts from his newsletter.

Best regards,

Michael

 

Why Do We Visualize Quantitative Data?

Per Stephen Few, we visualize quantitative data to perform three fundamental tasks in an effort to achieve three essential goals:

[Image: the three fundamental tasks of data visualization and the three essential goals they serve]

These three tasks are so fundamental to data visualization that Steve used them to define the term, as follows:

Data visualization is the use of visual representations to explore, make sense of, and communicate data.

Steve poses the question: why is it that we must sometimes use graphical displays to perform these tasks rather than other forms of representation? Why not always express values as numbers in tables? Why express them visually rather than audibly?

Essentially, there is only one good reason to express quantitative data visually: some features of quantitative data can be best perceived and understood, and some quantitative tasks can be best performed, when values are displayed graphically. This is so because of the ways our brains work. Vision is by far our dominant sense. We have evolved to perform many data sensing and processing tasks visually. This has been so since the days of our earliest ancestors who survived and learned to thrive on the African savannah. What visual perception evolved to do especially well, it can do faster and better than the conscious thinking parts of our brains. Data exploration, sensemaking, and communication should always involve an intimate collaboration between seeing and thinking (i.e., visual thinking).

Despite this essential reason for visualizing data, people often do it for reasons that are misguided. Steve dispels a few common myths about data visualization.

Myth #1: We visualize data because some people are visual learners.

While it is true that some people have greater visual thinking abilities than others and that some people have a greater interest in images than others, all people with normal perceptual abilities are predominantly visual. Everyone benefits from data visualization, whether they consider themselves visual learners or not, including those who prefer numbers.

Myth #2: We visualize data for people who have difficulty understanding numbers.

While it is true that some people are more comfortable with quantitative concepts and mathematics than others, even the brightest mathematicians benefit from seeing quantitative information displayed visually. Data visualization is not a dumbed-down expression of quantitative concepts.

Myth #3: We visualize data to grab people’s attention with eye-catching but inevitably less informative displays.

Visualizations don’t need to be dumbed down to be engaging. It isn’t necessary to sacrifice content for the sake of appearance. Data can always be displayed in ways that are optimally informative, pleasing to the eye, and engaging. To engage with a data display without being well informed of something useful is a waste.

Myth #4: The best data visualizers are those who have been trained in graphic arts.

While training in graphic arts can be useful, it is much more important to understand the data and be trained in visual thinking and communication. Graphic arts training that focuses on marketing (i.e., persuading people to buy or do something through manipulation) and artistry rather than communication can actually get in the way of effective data visualization.

Myth #5: Graphics provide the best means of telling stories contained in data.

While it is true that graphics are often useful and sometimes even essential for data-based storytelling, it isn’t storytelling itself that demands graphics. Much of storytelling is best expressed in words and numbers rather than images. Graphics are useful for storytelling because some features of data are best understood by our brains when they’re presented visually.

We visualize data because the human brain can perceive particular quantitative features and perform particular quantitative tasks most effectively when the data is expressed graphically. Visual data processing provides optimal support for the following:

1. Seeing the big picture

Graphs reveal the big picture: an overview of a data set. An overview summarizes the data’s essential characteristics, from which we can discern what’s routine vs. exceptional.

The series of three bar graphs below provides an overview of the opinions that 15 countries had about America in 2004, not long after the events of 9/11 and the military campaigns that followed.

[Graphs: three bar graphs summarizing 15 countries’ opinions of America in 2004]

Steve first discovered this information in the following form on the website of PBS:

[Table: the same country-opinion data shown as a table of numbers]

Based on this table of numbers, he had to read each value one at a time and, because working memory is limited to three or four simultaneous chunks of information at a time, he couldn’t use this display to construct and hold an overview of these countries’ opinions in his head. To solve this problem, he redisplayed this information as the three bar graphs shown above, which provided the overview that he wanted. Steve was able to use it to quickly get a sense of these countries’ opinions overall and in comparison to one another.

Bonus: Here is a link to where Steve discusses the example above on his website.

2. Easily and rapidly comparing values

Try to quickly compare the magnitudes of values using a table of numbers, such as the one shown above. You can’t, because numbers must be read one at a time and only two numbers can be compared at a time. Graphs, however, such as the bar graphs above, make it possible to see all of the values at once and to easily and rapidly compare them.

3. Seeing patterns among values

Many quantitative messages are revealed in patterns formed by sets of values. These patterns describe the nature of change through time, how values are distributed, and correlations, to name a few.

Try to construct the pattern of monthly change in either domestic or international sales for the entire year using the table below.

[Table: monthly domestic and international sales]

Difficult, isn’t it? The line graph below, however, presents the patterns of change in a way that can be perceived immediately, without conscious effort.

[Line graph: monthly domestic and international sales]

You can thank processes that take place in your visual cortex for this. The visual cortex perceives patterns and then the conscious thinking parts of our brains make sense of them.

4. Comparing patterns

Visual representations of patterns are easy to compare. Not only can the independent patterns of domestic and international sales be easily perceived by viewing the graph above, but they can also be compared to one another to determine how they are similar and different.

In Summary

These four quantitative features and activities require visual displays. This is why we visualize quantitative data.

MicroStrategy Report Optimization: Introduction

Readers:

I received an e-mail from a reader who wanted me to discuss how to optimize MicroStrategy reports.

I think this is an excellent topic and will today start a new blog topic called MicroStrategy Report Optimization. This will be a multi-part series (I will leave it open-ended so I can continue to add to it).

Before we start the series, it is important to understand report execution flow in MicroStrategy. That is what I will start with today.

Best Regards,

Michael

Source: MicroStrategy University, Deploying MicroStrategy High Performance BI, V9.3.1, MicroStrategy, Inc., September 2013.

Report Execution Flow

To better understand how some of these components affect report performance in a MicroStrategy environment, it is important to become familiar with the report execution flow.

The following image depicts the different steps involved in the report execution process.

Report Execution Flow

The following steps and components are involved in the report execution flow.

Report Execution Flow Steps

 
