
Microsoft Inspire is where industry experts come together to forge new partnerships, share ideas, and grow our businesses. That’s why Beyondsoft will be there, and we hope you will be too. Connect with our team of experts as we share our innovative solutions around predictive analytics, artificial intelligence, cloud migration, mixed reality, and mobile test automation. With our knowledge, experience, and expertise we help transform our customers’ businesses in this rapidly expanding digital economy. Please visit our booth (#1538) to learn more about these offerings:

Beyondsoft Advanced Predictive Experience: Our predictive analytics Software-as-a-Service, built on the Azure cloud, is an all-in-one solution that automates data ingestion, normalization, data storage, machine learning, and dynamic visualization of results. Our predictive analytics-enabled Business Intelligence applications provide future insights across a wide variety of business verticals: recommendation engines, targeted marketing, automated customer support, optimization and efficiency planning, workforce planning, and fraud detection.

Beyondsoft Artificial Intelligence (BAI) Ops: This efficient DevOps automation tool helps businesses resolve high volumes of false alerts and reduces the manual effort of finding root causes, lowering rework rates. BAI Ops tracks root causes across multiple data sources, provides analysis of the key relationships needed to detect and fix errors, and drives automation to achieve operational productivity. The solution is highly customizable, with flexible adaptors to integrate with existing systems. Customers who rely on BAI Ops see cost savings in maintenance and repairs.


Cloud Migration: We support our customers’ cloud journeys, both on-premises-to-cloud and cloud-to-cloud, reducing capital expenditures while delivering secure, scalable, high-performance services and solutions on the cloud platform that best meets their needs. As a customer prepares for a migration effort, our cloud consulting team uses a repeatable engagement framework to lead them through a thorough Cloud-ready Assessment consisting of an Architectural Review, a Security Review, a Performance Analysis, and a Cost Benefit Analysis. We use the findings from each phase to generate recommendations customers can use to make informed, strategic decisions before we touch a single workload. Our delivery engineers then execute the migration roadmap generated during the Assessment phase, leveraging best-of-breed tools and technologies for the migration itself and laying the foundation for the ongoing monitoring and analytics customers need to govern cloud security, compliance, and costs.


Beyondsoft Mixed Reality (MR) + Design, MR + Smart Building, and MR + Smart City Planning apps: MR + Design enables customers to experience interior design in 3D and give visual feedback in real time, tightening the feedback loop. Designers can create a cohesive look that represents each customer’s style and needs. MR + Smart Building makes 3D building information management more interactive, addresses issues as they arise, and provides seamless control over building management, security, and energy systems. MR + Smart City Planning brings a brand-new experience to government officials developing city plans in 3D. We deploy drones to draw 3D city maps and to monitor city and human traffic, air quality, and other city data. City planning teams can also create a digital copy of the current city, make changes, and visualize the future digital city.

Beyondsoft Devicepass: This innovative device management and testing solution brings your traditional device lab to the cloud and into your favorite browser with cutting-edge technologies:

1. End-to-end mobile automation solutions supporting most mainstream mobile automation frameworks, such as Appium and Calabash, plus a built-in, scriptless record-and-playback engine
2. An AI-enabled app crawler smart enough to detect UI changes, supporting defect detection and test automation script enhancement
3. Enhanced manual testing support: control multiple devices in your browser simultaneously, use the sync-on feature to clone manual operations across devices, and take advantage of advanced screenshot and diagnostic info management
4. Built-in CI/CD support with a Jenkins plug-in, REST API, and CLI

Devicepass provides many additional features that target your enterprise needs, including test project management, multi-tenant support, granular device utilization reports, and role-based permission control. Over time you’ll achieve better app quality in shorter release cycles, with significant cost savings.

Beyondsoft’s presentation schedule at booth #1538

When you visit us, we’ll share how we’re transforming business models to drive digital success.

MS Inspire 2017 presentation schedule


Using Microsoft’s Power BI to Implement a Management Dashboard Directly Connected to Atlassian’s JIRA APIs

 

The Beyondsoft Data & Analytics Team lives and breathes JIRA, Atlassian’s phenomenal issue and task tracking solution. We particularly love the myriad data points it captures during a ticket’s life cycle. Unfortunately, we’ve found its reports to be sadly lacking; this deficiency evolved from a nuisance into a major bottleneck during the team’s adoption of Scrum, when we needed to analyze how we were moving our work forward during each sprint.

During a few spare cycles I reviewed JIRA’s API documentation and found that most of the data was accessible; we just needed to put it all together. Our team brings data together for clients all the time using a variety of techniques and tools (from full-scale cloud data warehouses to lightweight tabular models and everything in between), but this project seemed to call less for something bullet-proof and more for a quick, iterative approach. We needed data fast to help us improve our team’s effectiveness and answer some key questions:
• What work are we doing?
• Are we focusing on the right priorities?
• Are we neglecting any of our projects?

Enter Microsoft’s Power BI. Over the past few years this application has become our go-to tool for data visualization. Its underlying engine supports rapid prototyping that allows for mashing up data without the need for extensive supporting infrastructure. Could we quickly build a Power BI dashboard for our management team to use in reviewing and prioritizing our dev team’s progress? Spoiler alert: we could, and did. And it was awesome. Power BI surpassed all my expectations, and we used it to build a quick and dirty tool in our spare time. Here’s the story of how this came together (leaving out some details of a few dead ends).

 

Step 1: Determine if a Power BI service content pack is available

If you’re interested in integrating an alien data source into Power BI, one of the first things you should do is check whether a Power BI service content pack is available for it. Conveniently, Atlassian has one at https://app.powerbi.com/groups/me/getdata/services/jira. Its goal is to provide an instant dashboard for visualizing workloads and velocity.
Connecting our JIRA instance’s data to Power BI was quick and easy, but the canned dashboard wouldn’t work for the particular workflow our team was focused on. It provided a simple way of viewing all tasks but didn’t break work out by project or provide deeper insight into where our team’s workflow was stalling at the sprint level.

 

Step 2: Explore your APIs

After determining the Power BI service content pack wasn’t going to provide an adequate solution, we dove into an exploration of the JIRA REST API and what it would take to pull together the view we were interested in. Our first step was to determine what data was available, so we began reviewing the Atlassian JIRA API documentation. For those following along at home, we found the following resources particularly useful:
• Tutorial for beginning to explore the API: https://developer.atlassian.com/jiradev/jira-apis/jira-rest-apis/jira-rest-api-tutorials/jira-rest-api-version-2-tutorial
• Documentation for the Board-getAllBoards endpoint: https://docs.atlassian.com/jira-software/REST/cloud/#agile/1.0/board-getAllBoards

We determined that the key data streams for our team would be sprints and issues for several of our JIRA boards, and began creating test API calls to start exploring the data:
• Sprint data: https://[instanceName].atlassian.net/rest/agile/latest/board/[boardID]/sprint
• Issue data (at board level): https://[instanceName].atlassian.net/rest/agile/latest/board/[boardID]/issue
• Issue data (at sprint level): https://[instanceName].atlassian.net/rest/agile/latest/sprint/[sprintID]/issue
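
To sanity-check an endpoint before building anything bigger, you can paste it into a short Power Query (“M”) expression. Here’s a minimal sketch, with a placeholder instance name and board ID; Power BI prompts for credentials on first connection:

let
    // Placeholder instance name and board ID; substitute your own
    Source = Json.Document(
        Web.Contents("https://myinstance.atlassian.net/rest/agile/latest/board/1/sprint")
    )
in
    Source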

 

Step 3: Consume JSON within Power BI

The JSON returned was pretty much as we expected and contained all of the attributes necessary to build the dashboard we wanted to see. The next step was to consume the API data from within Power BI.
A good reference for the next steps is https://blogs.msdn.microsoft.com/charles_sterling/2016/05/25/how-to-call-rest-apis-and-parse-json-with-power-bi/. Consuming JSON within Power BI is incredibly easy, but it took a few transformations to get the data into the exact form we were looking for. We applied the following steps (raw “M” code follows each step if you’re working with the Advanced Editor; the assembled query is sketched after the list):
• Converted to Table: #"Converted to Table" = Record.ToTable(Source)
• Transposed Table: #"Transposed Table" = Table.Transpose(#"Converted to Table")
• Promoted Headers: #"Promoted Headers" = Table.PromoteHeaders(#"Transposed Table")
• Expanded values: #"Expanded values" = Table.ExpandListColumn(#"Promoted Headers", "values")
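
Put together, the whole query looked roughly like the following sketch (again with a placeholder instance name and board ID):

let
    // Fetch the raw JSON from the JIRA Agile API
    Source = Json.Document(
        Web.Contents("https://myinstance.atlassian.net/rest/agile/latest/board/1/sprint")
    ),
    // The response is a record; turn it into a two-column Name/Value table
    #"Converted to Table" = Record.ToTable(Source),
    // Flip it so the field names become column headers
    #"Transposed Table" = Table.Transpose(#"Converted to Table"),
    #"Promoted Headers" = Table.PromoteHeaders(#"Transposed Table"),
    // "values" holds the list of sprints; expand it to one row per sprint
    #"Expanded values" = Table.ExpandListColumn(#"Promoted Headers", "values")
in
    #"Expanded values"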

We repeated this for each of our three main data sources, resulting in three queries containing our basic set of data:
• Sprints
• Tickets in sprints
• Tickets

At this point we were able to clean up the data within Power BI: set the necessary data types, remove unnecessary columns, rename columns, and so on.
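
For example, continuing the sprint query above in the Advanced Editor, a few of those cleanup steps might look like this (the field names are illustrative of what the sprint endpoint returns):

// Illustrative steps appended after #"Expanded values" in the query above
#"Expanded records" = Table.ExpandRecordColumn(#"Expanded values", "values",
    {"id", "name", "state", "startDate", "endDate"}),
#"Changed Type" = Table.TransformColumnTypes(#"Expanded records",
    {{"id", Int64.Type}, {"startDate", type datetime}, {"endDate", type datetime}}),
#"Renamed Columns" = Table.RenameColumns(#"Changed Type", {{"name", "SprintName"}})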

 

Step 4: Knit your data together into a unified model

The data we had was real but broken across several disparate sources, making it unsuitable for the analysis we hoped to perform.
As a helpful intermediary for our slicing and dicing we implemented a basic date dimension using the following DAX and the “New Table” feature in the Data tab:

Date =
ADDCOLUMNS (
    CALENDAR ( DATE ( 2000, 1, 1 ), DATE ( 2025, 12, 31 ) ),
    "DateAsInteger", FORMAT ( [Date], "YYYYMMDD" ),
    "Year", YEAR ( [Date] ),
    "Monthnumber", FORMAT ( [Date], "MM" ),
    "YearMonthnumber", FORMAT ( [Date], "YYYY/MM" ),
    "YearMonthShort", FORMAT ( [Date], "YYYY/mmm" ),
    "MonthNameShort", FORMAT ( [Date], "mmm" ),
    "MonthNameLong", FORMAT ( [Date], "mmmm" ),
    "DayOfWeekNumber", WEEKDAY ( [Date] ),
    "DayOfWeek", FORMAT ( [Date], "dddd" ),
    "DayOfWeekShort", FORMAT ( [Date], "ddd" ),
    "Quarter", "Q" & FORMAT ( [Date], "Q" ),
    "YearQuarter", FORMAT ( [Date], "YYYY" ) & "/Q" & FORMAT ( [Date], "Q" )
)

 

Once we had implemented the rudimentary date dimension, we used it as the common join point for our three data sets.

Figure 1: Date dimension used as common join point for three data sets

 

Step 5: Visualize within Power BI

With a fully realized data model now available, we had the foundation for a dashboard our team could use to view sprint progress, with data to support our post-sprint retrospective conversations.

 

Conclusion: Power BI is great for web API consumption

Our team found Microsoft’s Power BI to be a phenomenal tool for rapidly implementing a management dashboard directly connected to Atlassian’s JIRA APIs. We didn’t have to build any databases or ETL pipelines, or write any code. The main limitation of this approach is that no data is stored outside of the upstream API sources, so complex warehousing (e.g., of historical data) might require kludgy workarounds.

Designing and Documenting a RESTful API Using Swagger

 

RESTful APIs are prevalent in the vast majority of web and mobile applications that we consume and, as API developers, that we code. They serve as a common interface that can be shared between web and mobile applications, and they allow us as developers to reduce the number of codebases we must maintain to support a wider range of applications. Given the impact of APIs in the industry, developers should take care to fully engage at every stage of the development lifecycle.

One of the most commonly overlooked steps in the development lifecycle is documentation. For me, it’s a task I inevitably put off as long as possible. Yet, it’s a critical piece of an application that, when done and done well, helps drive adoption. It would be a shame to invest hundreds or even thousands of hours into developing an API only to have potential users abandon it because they aren’t able to discover its capabilities quickly and easily.

This is where the beauty of Swagger comes in. By integrating your API specification with Swagger, you get the tools to design, build and document your API. This post will cover the Swagger tools available to design and document your API specification. I’ll save the build step using Swagger Codegen for a possible future post.

 

Why Swagger?

Swagger offers flexibility for developers on several levels.

 

Widely Adopted

Swagger provides a powerful framework of tools based on the Swagger Specification. This is an open-source specification that is the basis of the OpenAPI Specification (OAS). The Swagger Specification and suite of tools have become an industry standard for describing RESTful APIs and have been adopted by millions of developers and companies.

 

Programming Language Support

Swagger supports over 25 programming languages, so unless you’re coding in some eccentric language, Swagger probably supports your choice.

 

Feature-rich Framework

Swagger provides a complete framework of free and open-source tools that enable you to design, build and document your RESTful API quickly and efficiently.

 

Design With Swagger Editor

Swagger Editor is a browser-based tool that lets you quickly design your API using a straightforward YAML syntax. As you build out your API definition with the editor in the left pane, it’s rendered for you in the right pane, letting you immediately visualize your definition and easily identify errors.

 


Figure 1. Swagger Editor

 

Document With Swagger UI

Swagger UI is a feature-rich toolset that allows developers and API consumers to visualize and test an API without having to code an application just to make the necessary web requests. It has become an industry standard, meaning consumers of your API are presented with a familiar interface. This familiarity translates into less time spent searching through your documentation, more time getting to know your API, and ultimately, hopefully, a higher adoption rate!

 


Figure 2. Swagger UI

 

Summary

I’ve covered what Swagger is and why it’s beneficial to adopt. Now I’d like to take a closer look with a real-world example.

 

An Example Using JavaScript, Node.js, and Express

In a typical RESTful API project, you might structure your API in a modular way, with a /routes folder containing a single index.js file responsible for routing your API requests to the correct controller code.

 

Node App Without Swagger

 

/** ./server.js */
app.all('/api/v1/*', [require('./api/middlewares/validateRequest')]);
app.use('/', require('./api/routes'));

/** ./api/routes/index.js */
var router = require('express').Router();
var auth = require('../controllers/auth');
var customer = require('../controllers/customer');
var account = require('../controllers/account');
var admin = require('../controllers/admin');

// Public routes
router.post('/auth', auth.login);

// Must be authenticated to access these routes
router.get('/api/v1/customer', customer.getCustomers);
router.get('/api/v1/account', account.getAccounts);

// Must be authenticated AND properly authorized to access these routes
router.post('/api/v1/admin', admin.addUser);

Code Snippet 1. A Common RESTful API Routing Pattern

 

Note that the snippet above uses middleware to handle authentication and authorization. With Swagger, you can continue to use your middlewares; Swagger won’t get in the way.

 

A Basic Node App Using Swagger

Using Swagger UI (for this example, we’re also using the swagger-express-mw npm package), you can configure Swagger UI in your API server code and define your API specification in your swagger.yaml file.

 

/** ./server.js **/
var SwaggerExpress = require('swagger-express-mw');
var app = require('express')();
module.exports = app;

var config = {
    appRoot: __dirname
};

SwaggerExpress.create(config, function (err, swaggerExpress) {
    if (err) { throw err; }

    // install middleware
    swaggerExpress.register(app);

    app.use(swaggerExpress.runner.swaggerTools.swaggerUi());

    var port = process.env.PORT || 10011;
    app.listen(port);
});

Code Snippet 2. Defining Swagger UI in Your API Server

 

paths:
  /hello:
    # binds a127 app logic to a route
    x-swagger-router-controller: hello_world
    get:
      description: Returns 'Hello' to the caller
      # used as the method name of the controller
      operationId: hello
      parameters:
        - name: name
          in: query
          description: The name of the person to whom to say hello
          required: false
          type: string
      responses:
        "200":
          description: Success
          schema:
            # a pointer to a definition
            $ref: "#/definitions/HelloWorldResponse"
  /swagger:
    x-swagger-pipe: swagger_raw
# complex objects have schema definitions
definitions:
  HelloWorldResponse:
    required:
      - message
    properties:
      message:
        type: string

Code Snippet 3: swagger.yaml (note: some code removed for brevity)

 

Highlights of the swagger.yaml File

The following list highlights some of the key portions of the Swagger definition.

  • /hello:
    This definition specifies how users should be routed when they make a request to this endpoint.
  • x-swagger-router-controller: hello_world
    This line tells Swagger which code file acts as the controller for this endpoint.
  • get:
    This key defines the HTTP method being requested (GET, PUT, POST, etc.).
  • operationId: hello
    This property directs Swagger to the specific method in your controller to invoke for this endpoint.
  • parameters:
    This section defines the parameters of your endpoint. They can be defined as path, query, header, formData, or body.
  • definitions:
    This section defines the structure of objects used in responses or as parameters.

 

Why Should I Care?

The beauty of this process is that you don’t need a separate routing file and API spec definition file. Just define your API in swagger.yaml and the routing AND documentation are handled for you! To make defining your API even easier, you can use Swagger Editor to have a side-by-side view of your YAML definition and the resulting API documentation. It will instantly update to highlight any syntax or definition errors.
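
To close the loop on this example, here’s a sketch of what the controller referenced by the YAML above might look like. The file name matches the x-swagger-router-controller value, and the exported function name matches the operationId:

/** ./api/controllers/hello_world.js (illustrative sketch) */
'use strict';

module.exports = { hello: hello };

// Invoked by the Swagger router for GET /hello
function hello(req, res) {
    // swagger-express-mw exposes validated parameters on req.swagger
    var name = req.swagger.params.name.value || 'stranger';
    res.json({ message: 'Hello, ' + name + '!' });
}

Code Snippet 4. A Minimal Controller for the /hello Endpoint (Illustrative)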

 


Figure 3. Quickly Identify Errors in Your API Definition Using Swagger Editor

 

Limitations of Swagger

The Swagger community has done a great job of putting together an excellent framework that just works well. I haven’t found many limitations in my experience with Swagger, but it does have a few:

  1. The Swagger specification doesn’t support nullable data types. The consequence is that any fields you define as nullable won’t display in your model schema in Swagger UI. This limitation doesn’t prevent the use of nullable types, but it can create some confusion when you’re trying to discover your API definition.
  2. Swagger UI is pretty similar across implementations. I know, I just touted this as a benefit, and I believe it is. However, I can imagine a scenario in which a team would like to utilize the power of Swagger, but would like more control over the UI/UX. It isn’t impossible to do, but it requires quite a bit of re-engineering to do much more than update the skin of the UI.

 

Conclusion

Get out there and try Swagger! You’ll be up and running with very little effort, and building your RESTful API, along with its supporting documentation, much more consistently and efficiently.

 


Self-Service ETL


Self-Service ETL with Power BI

 

Recently, I’ve been using Microsoft’s Power BI heavily and am impressed with this solution as an end-to-end self-service business intelligence (BI) option. In addition, I’ve really come to appreciate the possibilities that self-service BI offers.

 

What is Self-Service BI?

Gartner defines self-service BI as “end users designing and deploying their own reports and analyses within an approved and supported architecture and tools portfolio.” The Data Warehousing Institute (TDWI) describes it as “[t]he facilities within the BI environment that enable BI users to become more self-reliant and less dependent on the IT organization.”

But Forrester Research notes there’s no standard definition of self-service BI.

 

What most experts on self-service BI do agree on is the value of removing the IT department as a barrier between users and the company data they need to access and analyze. Often the access issue can be solved, but the analysis issue is more difficult, especially if additional transformations and manipulation are needed to answer business questions.

 

What is ETL? What is Self-Service ETL?

As anyone who performs BI or data analysis will confirm, getting data into the format needed for reporting and analysis is the most time-consuming part of data projects.

 

Extracting data from a source location, transforming it into the desired format, and then loading it into a destination location is referred to as Extract, Transform, Load (ETL). It may require coding skills or data integration tools like SQL Server Integration Services (SSIS), and it accounts for a substantial portion of BI projects.

 

Power BI really does stand out for its ability to pull data from a variety of sources and transform it for self-service users. Power Query is the component of Power BI that provides lightweight ETL for these users. Power Query uses Microsoft’s “M” language, which can be accessed through a user interface or written directly.

 

Power Query is only lightweight when compared to enterprise tools like SSIS; the average end user should find it quite powerful yet easy to use. It can:

  • Fetch data from many different sources through its ever-growing list of connectors (see Figure 1).
  • Filter and transform data, and create and enrich data.
  • Build a “refreshable” process for the end user.

 


Figure 1. Connectors in Power BI.

 

 

Again, Power Query uses Microsoft’s “M” language. If the user needs more functionality than what’s exposed through the Power Query Editor user interface, there’s the option to “tweak” a query using the Advanced Editor, or to just write a brand-new query using the Blank Query option.
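
As a rough sketch of what a hand-written blank query can do, the following “M” pulls a CSV file from the web and filters it; the URL and column name are placeholders, not a real data source:

let
    // Placeholder URL; point this at any CSV file you can reach
    Source = Csv.Document(
        Web.Contents("https://example.com/sales.csv"),
        [Delimiter = ",", Encoding = 65001]
    ),
    #"Promoted Headers" = Table.PromoteHeaders(Source),
    // Keep only rows where the (hypothetical) Region column is "West"
    #"Filtered Rows" = Table.SelectRows(#"Promoted Headers", each [Region] = "West")
in
    #"Filtered Rows"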

 

The lightweight ETL capabilities within Power BI are a powerful addition to the self-service BI toolkit. For small and mid-size businesses, where skill sets may be limited, they provide a path to attainable BI. For enterprises, they remove the IT barrier to the timely reporting and analysis the business needs and give the business the flexibility to perform ad hoc analyses. For both environments, they offer rapid prototyping that can be shared to convey business requirements to internal or external BI developers.

 

In future posts I’ll show how Power Query/“M” can be used to perform self-service ETL tasks.

 


In a world where data drives business decisions, great data visualization is crucial. Businesses today demand reporting dashboards that can assist with decision making and strategic planning.

Generating sleek and informative reports is not an easy task. They require a careful balance between displaying information that provides real insight and keeping the dashboard design simple and user-friendly. To achieve this balance for my clients, I use parameters in Tableau. Dynamic visuals, strong aesthetics, and easy maintenance are just a few benefits of Tableau’s parameterized visuals.

Dynamic Visuals

Using parameters in Tableau dashboards allows end users to select a measure and level of granularity for a single visual using multiple data sets. For example, if a client requests both sales and profit values in a dashboard, there are several methods of accomplishing this.

One solution would be to create two visuals: one showing sales, the other profit. With parameters, however, you can have a single visual that includes both data sets, with an option to choose whether to view sales or profit. We can take it a step further and give the end user the option to view the data for different time periods (weekly, monthly, yearly, etc.).
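
Under the hood, this is typically a string parameter driving a calculated field. Here’s a sketch, assuming a parameter named [Select Measure] with the allowable values “Sales” and “Profit”:

// Calculated field "Selected Measure" (parameter name is hypothetical)
CASE [Select Measure]
    WHEN "Sales" THEN SUM([Sales])
    WHEN "Profit" THEN SUM([Profit])
END

Place the calculated field on the view and show the parameter control, and the visual swaps measures as the end user changes the selection.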


Aesthetics

A dashboard can become overwhelming when it contains multiple visuals of the same type. Consolidating them into a single dynamic visual greatly improves the dashboard’s aesthetics, freeing white space and capacity for visuals of different types. Tableau also does a great job of presenting parameter values to the end user, with several display options to fit the dashboard layout, such as a single-value list or a compact list.

It is crucial to understand how data is visualized on the dashboard. Stakeholders will appreciate a strong, user-friendly design that lets them easily manipulate and modify reports and gain valuable insights from them.


Easy Maintenance

As business requirements change, so does the reporting dashboard. Dashboards that are dynamic and agile, and that can change at the pace of business, are easier to maintain and update. Parameterized visuals assist with this: if an additional metric needs to be displayed in the same visual style, simply add it to the parameter values. Easily maintained dashboards decrease my development time and may drive down the time I bill to the client, since the hours required for dashboard revisions are lower than with a less dynamic setup.


If you need more help, Tableau offers video tutorials explaining parameters and other Tableau concepts.

Are you using parameters in Tableau? Where have you seen the most value in doing so?


SSAS Tabular and Power BI are two cornerstones of Microsoft’s new generation of BI tools, and they play quite well together out of the box. For these tips, I’ll assume you have a properly connected gateway up and running, linked to a well-designed tabular model in Power BI; none of this will look unfamiliar to anyone who has worked with a pivot table.



This blog was originally posted to the Piraeus Consulting blog on September 2nd, 2015 and may contain out-of-date information.

Depending on the insights desired, presenting data models as cumulative, rolling totals – as opposed to showing a Grand Total broken out over periods of time – can be an invaluable tool in analyzing key business decisions, spotting the difference between real trends and clear outliers, and crafting longer-term strategies.
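
In DAX, the language behind Power Pivot and SSAS Tabular models, a common pattern for this kind of running total looks like the following sketch, which assumes hypothetical Sales and Date tables:

Cumulative Sales Amount =
CALCULATE (
    SUM ( Sales[Amount] ),
    FILTER (
        ALL ( 'Date'[Date] ),
        'Date'[Date] <= MAX ( 'Date'[Date] )
    )
)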



This blog was originally posted to the Piraeus Consulting blog on November 24th, 2015 and may contain out-of-date information.

BI in Layers

This is an exciting and difficult time in Business Intelligence. The tools available are advancing at a breathtaking rate, while best practices are changing equally fast. The tools promise to take in data and output insights, but what that process requires can be somewhat vague. Usually, the programs are designed to tackle one of these four layers:

  1. Raw Data
  2. Data Transformation
  3. Data Model
  4. Visualizations

Oftentimes, when businesses struggle to produce results from their BI strategy, it is because they try to use one tool to accomplish everything, when the tool is really designed to tackle only one of the four layers.



 

This blog was originally posted to the Piraeus Consulting blog on February 22nd, 2016 and may contain out-of-date information.

For those who have followed my previous posts on the new Power BI application from Microsoft, you’ve probably already created some fantastic-looking visuals to tell the story of your data. If the art of tapping into a variety of sources, editing your dataset on the fly, and customizing your charts and graphs is old news, then let’s look at how you can manage your Power BI content in a whole new way. By this, I mean sharing your impressive visualizations through a new group workspace. Using this workspace, you can add granular membership and control permissions at the user level. While we’re at it, we will also look at how to migrate existing reports to this space and how to update and manage them through content packs.



This blog was originally posted to the Piraeus Consulting blog on March 14th, 2016 and may contain out-of-date information.


At their October user conference, Tableau unveiled their new visualization app: Vizable. Vizable is built for the iPad and promises to empower non-professionals to represent spreadsheets visually and to manipulate data in easy-to-use charts and graphs. Dave Story, Tableau’s Vice President of Mobile and Strategic Growth, positioned this app as a handy tool for the masses, saying, “It’s for anybody who has a tablet, data and questions.”



This blog was originally posted to the Piraeus Consulting blog on May 31st, 2016 and may contain out-of-date information.

Recently, I attended a brown bag where one of our senior consultants, let’s call her Michelle, outlined the basics of conceptual and logical data models. I’m nowhere near Michelle’s level of technical expertise, but I still found her presentation to be both informative and intriguing. By the time it was over, I felt compelled to give myself an exercise in conceptual data modeling.
