Oleg Shilovitsky posted a blog about PLM Platforms.
https://www.linkedin.com/pulse/article/20140829141304-949935-the-foundation-for-next-plm-platforms
Of course, I have to comment. In addition to the two major foundations of current PLM platforms, these platforms were also designed to address file proliferation and the need for collaboration. CAD tools need file management. They are single user, meaning people cannot collaborate on a design, and they produce a file that contains the geometry for a single part or a collection of parts. PLM platforms (or, in the early days, EDM and PDM) were developed to address the deficiencies of CAD: file management and assembly management.
PLM still has not solved the problem of providing control over multiple parts in one file. PLM systems do not cross file boundaries very well.
PLM has not provided very good collaboration capabilities either. Using check-in / check-out for collaboration is very inefficient and results in disjointed, unsynchronized knowledge development.
And don’t get me started on the inability to find information in PLM, or the fact that so much critical product knowledge is stored on drawings and in MS Word documents.
The next-generation PLM platform must be a design platform. Files must be eliminated and replaced with ways to capture and manage detailed data about products and components. Geometry is just another way to describe a product; it should not be separate from the design platform.
A couple of us at Digital Equipment Corporation (DEC) designed a multi-disciplinary, collaborative CAD system back in the 1980s. The idea was that people would design with objects consisting of geometry, schematics (for electrical), data, performance models, etc. The objects would know how to interact based on solid model interactions and other conditions. The problem was that the hardware and software infrastructure didn't exist to support our design, and the project never really got off the ground. If only it had...
Friday, August 29, 2014
Sunday, March 23, 2014
Is Data Engineering Needed in Product Development?
We have electrical engineering, mechanical engineering, structural, electronics, civil, etc. Each of these engineering disciplines has a specific set of formulas, principles, and heuristics that govern how the engineering discipline is performed. We codify these rules to protect the general public. Would you feel safe driving over a bridge that was engineered by a person who didn't study structural engineering?
In a world where most products produce or consume data, we need to understand how to engineer the data. The impact of improperly engineered data can be just as catastrophic as that of an improperly engineered bridge. Incorrectly collecting and analyzing data can lead to improper interpretation, which in turn can lead to decisions that cause harm. Any product that generates or consumes data should have a data engineer on the team.
A data engineer is not a programmer or a data modeler (although these skills may be useful) but a person who addresses issues in designing, building, managing, and evaluating advanced data-intensive systems and applications. The data engineer studies big data techniques, information theory, and entropy (the measure of uncertainty in a random variable). A good data engineer has both the ability to manipulate data and an understanding of the analytic purposes for which the data will be used.
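To make "entropy" less abstract, here is a minimal sketch of what it measures in practice. The part data below is entirely made up for illustration; the point is that a consistently coded attribute carries zero uncertainty, while inconsistently entered values carry more:

```python
import math
from collections import Counter

def shannon_entropy(values):
    """Shannon entropy H(X) = -sum(p * log2(p)) over observed values, in bits."""
    counts = Counter(values)
    total = len(values)
    ent = 0.0
    for c in counts.values():
        p = c / total
        ent -= p * math.log2(p)
    return ent

# A unit column entered consistently is fully predictable (0 bits of uncertainty);
# free-text entry of the same unit produces measurable entropy.
clean = ["mm", "mm", "mm", "mm"]
messy = ["mm", "mm", "MM", "millimeters"]
print(shannon_entropy(clean))  # 0.0
print(shannon_entropy(messy))  # 1.5
```

A data engineer watching a metric like this on incoming product data can spot inconsistent coding before it corrupts an analysis.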
There is a danger in engineering today that each engineering discipline operates independently without thinking about the product as a whole system. There needs to be a person on the engineering team who can “bring it all together”: a person who knows enough about each discipline to synthesize all aspects of the product, whether mechanical, electrical, or data.
This TED talk by Marco Annunziata, the Chief Economist at General Electric, explains why data is so important to products and, by my extrapolation, why data engineering is critical to product development teams.
Wednesday, January 29, 2014
Is Your Organization Making Good Product Decisions?
A recent engagement with a client has been keeping me very busy so I haven't posted in a while. I did have the opportunity to write a commentary on decision making in product development organizations.
The key takeaways are:
- Product development is a continual process of making decisions that turn the unknown into the known
- Faulty decisions can have a huge impact on your company
- High quality knowledge is required for making good decisions
- Building knowledge that results in corporate wisdom requires stable communication channels
- A new breed of decision-support tools establishes stable communication channels
Here is a link to the document:
Dana
Sunday, October 27, 2013
Forcing people to enter data
The problems deploying Obamacare, which are all over the news, started me thinking about PLM systems and deployments. In an article on electronic medical records, Michelle Malkin points out that there is building resistance to forcing medical practitioners to enter data. They see the first two to five minutes required to enter data into electronic health record systems as a waste of time for themselves and their patients. How many people in engineering have the same view of PLM?
Forcing people to enter data when they see no value in what they are doing results in a failed technology implementation. If people do not get anything in return for entering data, they will find ways to avoid the system.
PLM systems have to deliver value to the people who have to use the system. Borrowing from Agile software development practices, people need to "opt-in" (http://newtechusa.net/agile/the-agile-adoption-game/). They have to want to use the PLM system because they get value from using the system. The system must make their life easier, solve a problem, or remove a barrier. Just as with electronic medical records, requiring people to enter data because management wants the data will not work very well.
One reason people will opt-in to a PLM system is that it enables them to find things that they need. When people can find things they need, it saves them time. They see value in entering the data needed to find things. If you are implementing a PLM system, upgrading or just trying to improve adoption, concentrate on making search great.
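What "making search great" means at the data level can be sketched simply. The toy below builds an inverted index over part metadata so any word in any attribute finds the part; the part records and attribute names are hypothetical, and a real PLM deployment would use a proper search engine, but the payoff is the same: entering good data makes things findable.

```python
from collections import defaultdict

def build_index(parts):
    """Build an inverted index: each lowercase token in any attribute
    value maps to the set of part ids that contain it."""
    index = defaultdict(set)
    for part_id, attrs in parts.items():
        for value in attrs.values():
            for token in str(value).lower().split():
                index[token].add(part_id)
    return index

def search(index, query):
    """Return part ids matching every word in the query (AND semantics)."""
    tokens = query.lower().split()
    if not tokens:
        return set()
    results = set(index.get(tokens[0], set()))
    for t in tokens[1:]:
        results &= index.get(t, set())
    return results

# Hypothetical part records, for illustration only.
parts = {
    "P-100": {"name": "mounting bracket", "material": "steel"},
    "P-200": {"name": "bracket clip", "material": "nylon"},
}
index = build_index(parts)
print(search(index, "steel bracket"))  # {'P-100'}
```

The engineer who filled in "material" for P-100 is the one who benefits when "steel bracket" finds it instantly; that is the opt-in incentive in miniature.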
Friday, October 18, 2013
Great presentations in the Social Product Development and Collaboration track at the 2012 PI Congress / PLM Roadmap Conference
There were great presentations in the Social Product Development and Collaboration track at the 2012 PI Congress / PLM Roadmap Conference last week. The attendees were really interested in how social is impacting product development and product lifecycle management (PLM). The presentations will be available for download shortly.
My presentation discussed how applying information theory concepts improves collaboration and resulting innovation. This concept is a little "out there" I know, but it does make a huge amount of sense. When an information channel is stable and low entropy, a huge amount of information, new data can be put over the channel. Innovation requires newness, change and lots of information transfer. We also discussed how to create a stable information channel using social media concepts and capabilities. For more information, you can contact me.
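The claim that a stable, low-entropy channel carries more information is Shannon's channel capacity result. As a small illustration (using the textbook binary symmetric channel, not anything specific to my presentation), capacity drops quickly as the channel gets noisier:

```python
import math

def h2(p):
    """Binary entropy function H(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(error_rate):
    """Capacity of a binary symmetric channel: C = 1 - H(p),
    in bits of information carried per bit sent."""
    return 1.0 - h2(error_rate)

print(bsc_capacity(0.0))   # 1.0 -- a stable, noiseless channel carries full information
print(bsc_capacity(0.25))  # ~0.19 -- a noisy channel carries far less per use
```

The collaboration analogy: when the team's communication channel is noisy (unstable tools, lost context, miscommunication), far less genuinely new information gets through, and innovation suffers.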
John Mannisto from Whirlpool Corporation presented his team's collaboration techniques. He discussed the “Commons”: a place where social layers can combine and cross over to maximize the cross-flow of knowledge. He wants to be able to “go viral” with good ideas. The “Commons” social layer is the “water cooler” where explicit and tacit knowledge can combine.
John also told us we need to adopt new tools and new behavior - we need to get our kids involved. We need to learn from outside of work, because social media is outpacing corporate collaboration. We need to combine the new-generation behavior with the old engineering. Most engineering is based on Newton, Euler, and a bunch of French mathematicians. What continues to change is the way we execute.
Bruce Richardson from Salesforce.com gave a great presentation on the coming combination of customer relationship management (CRM), social, and PLM. With 4.5 billion people connected to social networks today, social has become the new way of sharing what we are doing and what we care about in our personal lives. And for businesses, there are over 150 million daily conversations happening that are related to products and companies. People are talking about your products and your company. This is how your brand is being created. You need to be able to connect with these customers.
Saturday, September 14, 2013
Google Image Search
Imagine you are an engineer (maybe you are one) and you must find that picture you remember seeing about a failure mode. You spend hours trying to find it. Now imagine you type in a couple of words that you remember about the photo and, boom, there it is! Fiction? Nope. Not anymore. Google just announced personal photo search in Google+.
I may sound like a Google salesperson, but these pattern recognition and search technologies can be huge time savers. They can also save a company's butt when the engineers miss a potential problem just because they didn't find a critical piece of data. How many times have I seen that? Many.
I tried searching my own photos in Google+. I searched for the word "beach". Sure enough, all the pictures of beaches came up. Awesome!
Monday, September 2, 2013
PLM Cannot Deal with Flexibility
I posted this in response to a blog entry by Oleg Shilovitsky, PLM and Data Modeling Flux.
PLM systems have had a very hard time with data flexibility, or data plasticity. PLM systems do not handle hierarchical classification particularly well, either. The rigidity needed for document revisions does not work for constantly evolving product and component attributes. It's one of the reasons I recommend to my clients a PLM solution architecture composed of a PLM system, search, and attribute management solutions.
Think about all the rows (data) and columns (attributes) engineers manage with individual spreadsheets. The amount of time wasted in managing these characteristics of products and components is truly staggering. The number of errors and product issues due to miscommunication of attribute data has resulted in billions in losses. You would think there would be a huge economic incentive to address this problem!
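One common way attribute management solutions get around the rigidity of fixed spreadsheet columns is an attribute-value table: one row per (part, attribute, value), so a new attribute needs no schema change. This is a minimal sketch, with hypothetical table, column, and part names:

```python
import sqlite3

# One row per (part, attribute, value) instead of one column per attribute.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE part_attribute (
        part_id TEXT NOT NULL,
        name    TEXT NOT NULL,
        value   TEXT,
        PRIMARY KEY (part_id, name)
    )
""")
rows = [
    ("P-100", "material", "steel"),
    ("P-100", "finish", "anodized"),
    ("P-200", "material", "nylon"),
    ("P-200", "max_temp_c", "120"),  # brand-new attribute, no schema change
]
conn.executemany("INSERT INTO part_attribute VALUES (?, ?, ?)", rows)

# Pivot back to a per-part view when an engineer needs it.
for part_id, name, value in conn.execute(
        "SELECT part_id, name, value FROM part_attribute ORDER BY part_id, name"):
    print(part_id, name, value)
```

The trade-off is real (typed validation and reporting get harder), which is why I pair the PLM system with a dedicated attribute management solution rather than forcing this into the PLM data model.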