Current Blog Entries by Larry Fry, CCP, MBA

Another WordPress weblog for my Business, Energy, Investing, IT, and Travel blog entries!

Macro-Level Resolution Strategies for the Ongoing Financial System Crisis of 2008

Just a few thoughts and possible resolution strategies (in outline format) to consider regarding the current financial system crisis from a concerned long-term investor, citizen, and voter.  And from an Independent's perspective, the political players from both parties don't seem to be focusing on real solutions, just on the problems and the finger-pointing (or "blame game") that results.  Note: In terms of a macro-level causal analysis of this financial crisis, private sector lenders issued risky home loans to buyers who could not afford them and then sold those loans to Wall Street financial institutions as bundled mortgage-backed securities that eventually became worthless with the downturn in the housing market. As a result, the heavily leveraged financial institutions became overly exposed to the collapsing asset class, leading to a liquidity crisis, a disruption of bank lending, and an overall global contagion.

I. Short-Term Macro-Level Strategies to Mitigate the Current Crisis (note that some of these strategies are now being implemented in some fashion) –

1). Continued implementation of timely "lender-of-last-resort" measures by central banks is needed in order to mitigate the current credit crisis before it spirals completely out of control.

1a). Immediate, large-scale damage-control measures are needed in order to restore the public’s confidence in financial markets and institutions.            

1b). Organized bailout programs by central banks across the globe are needed in order to reduce the overall amount of help and costs involved (i.e., the quicker the better in terms of lowering costs).

2). Aggressive actions required on the part of central banks and governmental agencies in order to mitigate the seizing global credit crisis include the following:

2a). Continued cutting of short-term interest rates and continued expansion of the amounts of loans made available to banks via auctions in order to maintain adequate liquidity levels.

2b). Force remaining lenders to extend low “teaser” rates on ARM based loans that have been given to sub-prime borrowers.           

2c). Force remaining lenders to restore partial homeowner equity to sub-prime mortgage holders having negative equity in order to slow the pace of foreclosures.

2d). Continued swapping of Fed funds (e.g., Treasury bills, etc.) for "Level 3" securities from struggling financial institutions in order to provide them with necessary liquidity.

2e). Permanent provision of emergency loans to the remaining investment banks, along with the increased regulation that should accompany it (i.e., regulation like that required of commercial banks).

2f). The establishment (ASAP) of an organization similar to the former Resolution Trust Corporation of the S&L days to use taxpayer funds to buy out the worst sub-prime loans.  Note: When the current credit crisis finally does ease up, central banks will then need to expedite reverse monetary actions designed to prevent the reintroduction of market speculation using “cheap” money.  

II. Long-Term Macro-Level Strategies & Proactive Regulatory Measures –

1). Implementation of  stricter governmental regulations for securitized mortgage loans.            

1a). SEC-directed revamping of underwriting standards to reduce the chances that credit risks will be underestimated, which in turn will reduce the overvaluing of securitized sub-prime loans.

1b). SEC-led mitigation of regulatory capital arbitrage attempts in the securitization process by enforcing the maintenance of minimum capital requirements by financial entities.

1c). SEC-led mitigation of "safety net" and risk transfer abuses in the securitization process.

2). Require that mortgage loan originators hold bigger equity positions in the securitized packages and that banks issue “covered” bonds backed by securitized mortgage loans in order to keep the risks on  their balance sheets.

2a). Increase involvement of criminal investigation agencies in the securities arena to mitigate the misrepresentation of the quality of mortgage loans in securities filings by financial entities. 

2b). Implementation of more stringent licensing requirements for mortgage brokers and tougher mortgage lending standards, including enhanced risk management practices by lenders.

2c). Implementation of more stringent disclosure and write-down requirements for financial institutions, as well as an increase in the monitoring of their “capital adequacy”.

2d). Consolidation of the different governmental finance agencies into a financial oversight “super” agency designed to rate sub-prime mortgages and financial securities per their safety levels.

2e). Establishment of well defined roles for the International Monetary Fund (IMF) in terms of its serving as a bailout agent for emerging market countries experiencing financial difficulties.            

2f). Need to mitigate any moral hazard issues that could arise due to IMF intervention (e.g., an increase in risk-taking activities by the governments of the countries being assisted, etc).            

2g). Complete banning of Alternative-A type mortgage loans, which require little or no documentation of a borrower's wealth or income, a gap that has resulted in abuses on the part of mortgage brokers.

3). Establishment of new generally accepted accounting standards (new FASB rulings) designed to force financial entities to value their investment instruments (e.g., CDOs) using market-based measures rather than their own pricing models, along with reform of the credit rating agencies.

3a). Mortgage-based securities should strictly be booked at "mark-to-market" value from an accounting standpoint instead of at "book" value in order to help mitigate the blatant overvaluation of the underlying collateral assets.

3b). Mortgage securities packages should be categorized and split into different "tranches" based on their differing levels of certified risk for both investing and auditing purposes (an illustrative sketch of how tranching allocates losses follows this outline).

3c). Tighter regulation of credit rating agencies (e.g., Moody’s, S&P, et al.) by the SEC and Congress.

3d). Credit rating agency reform acts are needed to mitigate the overrating of tenuous capital structures being passed off as investment-grade securities and to enhance agency quality control measures.

3e). Recourse measures should include the suspension of credit rating agencies that continuously propagate inaccurate ("pumped up") ratings due to the conflicts of interest involved with the issuers.

3f). The compensation method for credit rating agencies needs to be changed from that of being paid by the issuers of structured debt products to that of being paid by investors to eliminate conflicts of interest.

3g). Credit rating agencies should be required to decline credit rating services for exotic types of securities that have no performance records to track.

3h). Credit rating agencies should also be precluded from having exclusive access to non-public investment information.

Agency Note:  SEC chief Mary Schapiro has now called for sweeping industry changes at a roundtable meeting as credit ratings agencies have been thoroughly blasted for not warning about the risks of subprime mortgage securities. Moody’s, Standard & Poor’s, and Fitch dominate the industry: one proposal calls for a governmental ratings agency that would compete with these three firms.

3i). Central banks should have quickly executable, pre-approved contingency plans "in place" for when unexpectedly large write-downs by financial institutions occur in the future.  The immediate availability of liquidity measures and the continued identification of potential merger partners for failing financial institutions are key here.

3j). Tighter regulation of the corporate auditing function by the SEC and Congress.  If necessary, the corporate auditing function may even need to be "nationalized" to become entirely a governmental function in order to eradicate the inherent conflicts of interest that currently exist between corporations and the private auditing/consulting firms that they employ on a high-fee basis (i.e., the need is to reduce the risk of over-inflated equity valuations based on creative accounting measures, etc.).
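
To make the tranching idea in item 3b above more concrete, here is a minimal, purely illustrative Python sketch of how losses on a securitized mortgage pool are absorbed by the equity, mezzanine, and senior tranches in that order.  The pool size, tranche splits, and loss amount are hypothetical figures chosen for demonstration only, not actual market data.

# Illustrative tranche loss "waterfall" for a hypothetical securitized mortgage pool.
# All figures are made up for demonstration purposes only.

def allocate_losses(tranches, total_loss):
    """Apply pool losses to tranches from the most junior to the most senior."""
    results = []
    remaining_loss = total_loss
    for name, size in tranches:               # list is ordered junior -> senior
        loss_taken = min(size, remaining_loss)
        remaining_loss -= loss_taken
        results.append((name, size, loss_taken, size - loss_taken))
    return results

# Hypothetical $100M pool: 5% equity, 15% mezzanine, 80% senior tranche.
pool = [("Equity", 5_000_000), ("Mezzanine", 15_000_000), ("Senior", 80_000_000)]

for name, size, loss, remaining in allocate_losses(pool, total_loss=12_000_000):
    print(f"{name:10s} size ${size:>11,}  loss ${loss:>11,}  remaining ${remaining:>11,}")

The point of item 3b is that each tranche would carry its own certified risk level and rating, so investors and auditors could see exactly where in this loss-absorption order a given security sits.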

Note: These macro strategies were initially developed as my section of a group project/presentation in a graduate-level finance course at the University of Houston (i.e., FINA 7340 – Financial Markets & Institutions).

Click on the URL below for a recently published NYU Stern working group paper that provides excellent insights into viable, real-time solutions for financial reform:  http://govtpolicyrecs.stern.nyu.edu/docs/whitepapers_ebook_full.pdf

Written by Larry Fry, CCP, MBA

August 10, 2015 at 12:47 pm

Natural Gas Pipeline Customer-Satisfaction Critical Success Factors

The following points represent the generally accepted critical success factors that distinguish natural gas pipeline companies of choice, quantifying the perceptions and needs of their customers based on the latest Natural Gas Pipeline Customer Surveys (2013) by research firm Mastio and Company.

01).  A pipeline company's 'Firm' (i.e., guaranteed) gas transportation services must be highly reliable in order to justify the higher billing rates charged for this service.
02).  A pipeline company's scheduled gas volumes must always be highly accurate.
03).  A pipeline company’s information systems for nominating, scheduling, allocating, balancing, and invoicing gas transportation and storage services must be highly effective and reliable.
04).  Accurate operational information must be made readily available to all of a pipeline company’s customers per FERC regulations and mandates (e.g., Info Post).
05).  A pipeline company’s account representatives must be accessible on a 24 by 7 basis in order to help resolve any customer-support related business needs in a timely, courteous manner.
06).  A pipeline company's account representatives must also be good listeners who are capable of providing prompt, accurate responses to customer inquiries/requests based on high levels of industry sector expertise.
07).  Pipeline company initiated communications with gas shippers and supply operators must be highly meaningful, informative, and of high quality.
08).  A pipeline company must be willing and able to offer its customers flexible transportation services such as gas pooling and aggregation (e.g., TABS services, etc.).
09).  A pipeline company’s billing invoices must always be accurate in nature and delivered to its customers in a consistent, timely manner.
10).  A pipeline company must maintain a strong culture of integrity, accountability, and compliance in terms of conveying ethical, honest and transparent business conduct.
11).  A pipeline company must create and maintain superior and sustainable value for its investors, customers, employees, and the communities which it serves.
12).  A pipeline company must maintain safe, reliable, and environmentally sustainable operating environments with open communications to all parties concerned.
13).  A pipeline company must stay focused on contributing to the economic, environmental and social well-being of the communities which it serves.
14).  A pipeline company must continuously deliver sector-leading value to its investors and other stakeholders.
15).  A pipeline company’s employees, executives, and board members must all be fully committed to a culture of providing superior customer service in a fair, efficient, and reliable manner.

Relevant Notes –
a).  ‘Firm’ gas transportation services are designed to reserve pipeline space and guarantee subscribing customers (e.g., hospitals; schools; homes; offices; etc.) that their specific volumes of gas will be scheduled and transported when needed at higher firm based rates.  In addition, firm transportation rates include any capital and fixed costs involved with setting up the pipeline company’s ability to provide firm transportation services, as well as any variable transportation costs involved in transporting the gas volumes across pipeline sectors and interchanges (e.g., fuel charges; reservation charges; etc.).  Finally, firm transportation rates must be paid by firm customers whether or not their scheduled gas gets shipped (i.e., sometimes gas pipelines cannot transport all nominated gas due to peak constraints, etc.).

b).  'Interruptible' gas transportation services are designed to enable subscribing customers (e.g., industrial concerns; power companies; etc.) to schedule and move gas on a pipeline at a lower billable rate if/when space is available, but not if the pipeline is full with higher-priority firm gas at the time.  In addition, interruptible transportation rates enable the pipeline company to recover a smaller percentage of the capital and fixed costs involved with providing the service (i.e., smaller than under firm rates), as well as any variable costs involved in transporting the gas volumes across pipeline sectors and interchanges (e.g., fuel charges; reservation charges; etc.).  (A rough numerical comparison of firm and interruptible billing follows these notes.)

c).  Transporting compressed natural gas through pressurized pipelines is considered to be the most efficient and economically feasible method available to the industry; other methods, such as rail and vehicular transport, are considered less desirable due to the low density nature of natural gas (i.e., as opposed to liquids). In other words, gas transportation profitability is closely tied to the volumes of gas shipped, and higher volumes of gas can be delivered to customers through pressurized pipelines due to the gas compression advantage, etc.

d).  A pipeline's greatest value is now its ability to mitigate shipping bottlenecks and serve customers with a reliable (i.e., firm) gas supply even during periods of peak market demand. This is largely due to the advent of flat gas price spreads resulting from the use of fracking techniques and the new geographical supply areas they have opened up (e.g., Marcellus Shale, etc.). As a result, legacy pipelines can no longer profit from previously existing geographical spot price differentials, where they would receive cheaper gas in one region of the country, ship it to delivery points in another region, and sell it for a higher price.

e).  A pipeline's capacity now needs to be fully contracted under firm reservation agreements for revenue purposes, as gas prices can no longer be arbitraged based on geographical origin (per note d above). Lower-cost production and shipping tariffs in the newly developed shale areas are displacing higher-cost production and shipping tariffs in the traditional supply areas of the country, leaving some pipeline capacity underutilized due to the resulting reduction in gas flows along the traditional west-to-east pathways. Note: Implementing flow reversals that move gas from cheaper supply sources in the east to delivery points out west is one way to mitigate this underutilization issue.
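
As a rough illustration of the firm versus interruptible rate structures described in notes a) and b) above, here is a minimal Python sketch.  The rates, reserved capacity, and shipped volumes are entirely hypothetical and are not drawn from any actual pipeline tariff.

# Illustrative comparison of firm vs. interruptible transportation charges.
# Rates and volumes are hypothetical and do not reflect any actual pipeline tariff.

RESERVATION_RATE = 0.25    # $/Dth of reserved capacity per day (firm service only)
FIRM_USAGE_RATE = 0.03     # $/Dth actually shipped under firm service
INTERRUPTIBLE_RATE = 0.15  # $/Dth actually shipped under interruptible service

def firm_charge(reserved_dth_per_day, shipped_dth, days=30):
    # The reservation charge is owed whether or not the gas actually moves.
    return reserved_dth_per_day * days * RESERVATION_RATE + shipped_dth * FIRM_USAGE_RATE

def interruptible_charge(shipped_dth):
    # Only pay for volumes the pipeline actually had room to move.
    return shipped_dth * INTERRUPTIBLE_RATE

# A firm shipper reserving 10,000 Dth/day but shipping only 200,000 Dth this month:
print(f"Firm bill:          ${firm_charge(10_000, 200_000):,.2f}")
# An interruptible shipper that got 200,000 Dth moved when space was available:
print(f"Interruptible bill: ${interruptible_charge(200_000):,.2f}")

The trade-off the sketch highlights is that the firm shipper owes the reservation charge whether or not its gas moves but is guaranteed capacity at peak times, while the interruptible shipper pays only for volumes actually transported and risks being cut when the pipeline is full.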

Written by Larry Fry, CCP, MBA

July 22, 2015 at 1:30 pm

Options Trading Advisory Services

Looking for an options trading advisory service that provides consistently profitable option trades?  Schaeffer's Investment Research is a very educational advisory service that also offers a gamut of different subscription services designed to enable the subscriber to mix and match different options trading styles.  I am currently a subscriber to Schaeffer's "Super Trader" aggressive alert services package and its more moderate "Weekend Trader" bulletin service.  Another good options advisory service to consider here is Option Monster, which also offers several subscription services tailored to one's particular level of trading experience and desired level of risk.  Option Monster has a very good education section within its website as well.  I am currently a subscriber to Option Monster's aggressive "InsideOptions" alert service, which provides up to five options trade ideas each week.  Schaeffer's and Option Monster are two of the more popular options advisory subscription services out there, and they have both recommended some real winners for me in the past.  But my overall returns from both of these services have been tempered by the many additional recommendations that didn't pan out too well (especially Schaeffer's in the past); as a result, one needs to keep the overall picture in mind when striving for consistent profitability with these services.  In other words, additional technical analysis and fundamentals-based research on the subscriber's part is usually required in order to proactively eliminate most of the additional recommendations that will not work out and thereby increase one's overall level of profitability.

In terms of specific options recommendation services that identify short-term opportunities that are about to "pop", several alert services offered by Schaeffer's and Option Monster immediately come to mind.  For instance, Schaeffer's "Expiration Week Countdown", "Overnight Trader", and "Weekly Options Trader" alert services offer these aggressive short-term "pop" types of opportunities, as do Schaeffer's "Event" and "Players" series.  Option Monster's "InsideOptions" service offers trade recommendations similar to those offered by Schaeffer's, although Option Monster tends to curb the urge to flood the subscriber with recommendations that have not been fully researched or scrutinized by its senior options analysts.  Furthermore, Option Monster recommendations issued by co-founder and "InsideOptions" lead analyst Pete Najarian have had the highest winning percentage (for me) of all of the advisory services mentioned here.  Pete Najarian does not issue direct recommendations too often for the "InsideOptions" service, but when he does they usually work out very well; as a result, I rate Pete Najarian as the best of the options advisory analysts that I have been following for the past several years.  One additional options subscription service to consider here is the one offered by TheStreet.com, with which the folks over at Option Monster share a common history.  In comparing these different types of option recommendation services, I actually believe that quite a few of these advisory services "borrow" from each other to a certain degree and then repackage the information gleaned for their particular ongoing marketing models.  For example, Schaeffer's and Zack's seem to partner up on some of the service recommendations offered by Schaeffer's, as do Schaeffer's and options research group What'sTrading.  In addition, Option Monster and TheStreet.com seem to partner up on some of their recommendations as well, which makes sense based on the above-mentioned common history that they share.

In conclusion, I should reiterate that most of the popular options advisory services out there will recommend some big winners to you as a subscriber, but you'll need to do some additional "homework" to weed out the non-winning ideas in order to make the service profitable on a consistent basis.  Again, my overall returns from the above-mentioned subscription services are usually tempered by the many additional ideas that do not work out, so I try to stay aware of the overall picture here in order to achieve a model of consistent profitability.  Additional technical analysis and fundamentals-based research on the subscriber's part is necessary in order to proactively eliminate the ideas that won't work out and thereby increase one's level of profitability.  Granted, these services will recommend a few nice winners for you per their advertisements, but you'll definitely need to scrutinize all of their recommendations with your own research (and closely monitor them) in order to make subscribing to the service worthwhile.  As an example of this, I have now gotten to the point with Schaeffer's "Expiration Week Countdown" service recommendations where I weed out all but one or two of the ideas issued during options expiration week of each month.  For example, during options expiration week in August 2010 I eliminated from consideration all of the recommendations made but one, which was one of the Netflix (NFLX) call options that was set to expire at the end of that week (Call -> NFLX -> AUG 21, 2010 -> Strike Price $110).  I selected this particular idea out of the overall group that Schaeffer's recommended due to the high levels of volatility associated with NFLX's stock price and volume at the time.  And this particular call option did very well for me that week (i.e., more than enough to cover the cost of my annual subscription).  So I have found that I am able to select one or two real winners to play from the list of recommendations made by this particular service during options expiration week each month, and my experience has shown that the rest of the ideas are usually worth ignoring after doing the requisite research. Hope this all helps!

http://www.schaeffersresearch.com/cart/ViewAllProducts.aspx#aggressive
http://www.optionmonster.com/about/products.jsp

See the link below for an enhancement by Tony Hutchins of finance professor Peter Carr's instructive paper on implementing the Black-Scholes call/put option pricing model on the HP-12C programmable financial calculator. http://www.hpcc.org/datafile/V23N3/V23N3P25.pdf
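
For readers curious about what that model actually computes, here is a minimal Python sketch of the standard Black-Scholes formulas for European call and put prices.  The example inputs (spot, strike, expiry, rate, volatility) are arbitrary illustrations, not a trade recommendation.

# Standard Black-Scholes prices for European call and put options.
# Inputs: spot S, strike K, time to expiry T (years), risk-free rate r, volatility sigma.
from math import log, sqrt, exp, erf

def norm_cdf(x):
    # Cumulative standard normal distribution via the error function.
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def black_scholes(S, K, T, r, sigma):
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    call = S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)
    put = K * exp(-r * T) * norm_cdf(-d2) - S * norm_cdf(-d1)   # consistent with put-call parity
    return call, put

# Arbitrary example: $110 strike, $112 spot, one week to expiry, 1% rate, 60% implied volatility.
call, put = black_scholes(S=112, K=110, T=7 / 365, r=0.01, sigma=0.60)
print(f"Call: {call:.2f}  Put: {put:.2f}")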

Web Data Mining and Orwellian Risks for Abuse at the Private Individual Level

This posting consists of my comments on a guest post published on Forbes.com by Chris Taylor, a technologist with TIBCO Software, founder of successfullworkplace.com, and fellow member of the LinkedIn group 'Disruptive Technologies'.

The ever-burgeoning data explosion and the technologies being developed to extract meaningful information from it (e.g., data mining; interpretive/predictive analytics; etc.) are here to stay.  The competitive advantage that stands to be gained by companies, and by the military/security sectors of governments, that can effectively glean valuable information from the morass of personal data now available on the world-wide web is immense.  Personal data that is gathered and analyzed/stored at a sector group level seems to present less of a threat to each individual's personal privacy when used in traditional ways (e.g., company marketing studies).  But the "Orwellian" risks for abuse at the private individual level, as the data mining technologies in use become increasingly sophisticated, cannot be ignored.  Furthermore, the judicial systems of developed countries have not been keeping pace with the burgeoning privacy violation ramifications of the information revolution that is currently taking place.  Exacerbating all of this is the fact that, for increased revenue purposes, social networking users are being urged by the likes of Facebook's Mark Zuckerberg to become more transparent by revealing more of their personal information on these sites; a factor which is serving to make personal transparency in public forums a current "popular culture dynamic".  So the genie is definitely out of the bottle here, which should behoove users of all social networking sites to become more familiar with the "primitive" privacy settings made available by the provider and use them to tailor who gets to see which parts of the personal information that they generate as time goes by.

Finally, to counter the increasingly sophisticated video parsing technologies currently being utilized by data mining entities, the best option for remaining anonymous to the video data analysis performed by companies (and unscrupulous governments, where they may exist) is to use iconic "monikers" in lieu of facial snapshots on one's social networking sites.  In addition, video and photo tagging on social networking sites should be meticulously controlled via one's privacy settings as well.

Link to Chris Taylor’s article “While You Slept Last Night: Big Data, Privacy, and the Public Square” – http://www.forbes.com/sites/kashmirhill/2012/08/28/while-you-slept-last-night-big-data-privacy-and-the-public-square/

Chris Taylor’s twitter handle is @Successful Work.

Interested LinkedIn readers are invited to join ‘Disruptive Technologies’ group for discussions on this & similar issues!

Status of Object-Oriented Database Management Systems?

Being a senior business systems (IT) consultant who has always utilized hierarchical (IMS) and relational database management systems (RDBMS), I'm wondering whatever happened to the concept of object-oriented database management systems (ODBMS), which at one point (i.e., about 10 years ago) was considered by some IT academic types to be the database structure design of the future for object-oriented software applications.  Do the newer data storage mechanisms and/or models currently out on the market represent various hybrid versions of ODBMS?  Or has this concept completely gone by the wayside before it ever had a chance to take off, due to its being considered "too disruptive" a technology by IT management?

My impression has always been that an ODBMS persistence structure would need to be in place in order for a true object-oriented software application to be able to make use of (and persist) intensive object nesting, cross-referencing, data type extensions, sub-classing, and inheritance support functions.  But perhaps the more powerful RDBMS engines, and the extended versions of the same, have now enabled object-oriented software applications to circumvent some of the performance and storage issues that traditionally have occurred with RDBMSs.  This would also include circumventing some of the object-to-relational and relational-to-object translation mapping issues that traditionally occur when persisting/recovering software objects using an RDBMS.  This translation mapping issue arises because the developer has to break the objects apart and then store the different object parts in the various database tables (or files) making up the RDBMS, making the translation mapping process tedious at best due to the resulting "semantic gaps" that occur.  In any case, my assumption is that providing the full functionality of an ODBMS, which would also enable developers to persist full class information (including object methods), is the biggest hurdle for the ODBMS genre to clear, thus rendering the ODBMS niche "too disruptive" a technology for most IT management types.
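
As a small illustration of the object-to-relational translation mapping issue described above, here is a minimal Python/SQLite sketch (the class and table names are hypothetical) showing how even a simple nested object has to be broken apart into separate relational rows and then reassembled by hand:

# Minimal illustration of the object-to-relational "semantic gap": a nested
# PurchaseOrder object must be flattened into two tables and manually rebuilt.
import sqlite3
from dataclasses import dataclass, field

@dataclass
class LineItem:
    product: str
    quantity: int

@dataclass
class PurchaseOrder:
    order_id: int
    customer: str
    items: list = field(default_factory=list)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER PRIMARY KEY, customer TEXT)")
conn.execute("CREATE TABLE line_items (order_id INTEGER, product TEXT, quantity INTEGER)")

def save(order):
    # The object graph is decomposed by hand into relational rows.
    conn.execute("INSERT INTO orders VALUES (?, ?)", (order.order_id, order.customer))
    for item in order.items:
        conn.execute("INSERT INTO line_items VALUES (?, ?, ?)",
                     (order.order_id, item.product, item.quantity))

def load(order_id):
    # ...and reassembled by hand on the way back out.
    customer = conn.execute("SELECT customer FROM orders WHERE order_id = ?",
                            (order_id,)).fetchone()[0]
    items = [LineItem(p, q) for p, q in conn.execute(
        "SELECT product, quantity FROM line_items WHERE order_id = ?", (order_id,))]
    return PurchaseOrder(order_id, customer, items)

save(PurchaseOrder(1, "Acme Gas", [LineItem("valve", 4), LineItem("gauge", 2)]))
print(load(1))

An ODBMS (or, for that matter, a modern object-relational mapping layer) exists precisely to hide this decompose/reassemble boilerplate; the difference is that a full ODBMS was envisioned to persist the complete class information, object methods included.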

Interested LinkedIn members are invited to join the "Disruptive Technologies" professional group (URL below):

http://www.linkedin.com/groups?about=&gid=1027037&trk=anet_ug_grppro

Impact of Disruptive Innovations on Existing Business Models

One factor to consider in the disruptive technology genre is the impact of the new technology (or innovations) on the existing business model.  It seems as if many disruptive innovations are really not "disruptive" in terms of the technological challenges presented, but are disruptive from the standpoint of the resulting business model challenges that don't get managed properly.  Polaroid's handling of digital imaging technology when it was new is a very good example of this (see Case below).  It lends credence to the premise that promising new technologies can end up falling through the cracks due to the failure of their supporting business models (and companies).  As a result, the necessary business model changes also need to be considered and implemented whenever a new disruptive technology is being implemented in order for it to be successful.

Case -> Comparisons of Polaroid’s Film-Based Business Model with Apple’s iTunes Model –

I).   Being a technology-driven company at the time, Polaroid was all about the technological challenges presented by the instant photo processing industry at the expense of the marketing challenges involved, thus resulting in its eventual bankruptcy filing.  When digital imaging technology came onto the scene, Polaroid was able to deal with it from a technological standpoint, but it could not change its existing film-based business model (i.e., Polaroid film sales) over to one based on digital imaging and processing (i.e., with no film involved).  As a result, the arrival of digital imaging technology served as a very disruptive innovation from a business model standpoint for Polaroid, as it went from being very profitable to experiencing quickly collapsing revenues over a short period of time.  This was primarily due to Polaroid's propensity to view the new digital imaging technology as a technological challenge only, thus ignoring the key business model challenges presented by this disruptive-innovation change.  The key point here is that the disruptive innovation dilemma involves not only technological challenges but also critical business-model challenges if the disruptive innovation is not managed competently.

Note:  In Polaroid's defense, there were numerous major hurdles involved with the restructuring of its overall business model, which consisted of the following -> 1). re-educating its employees; 2). initial lower profitability; 3). current product-line cannibalization; 4). increased management/stakeholder/customer based conflicts; 5). complex organizational changes (i.e., corporate culture issues); and 6). inherent conflicts with its traditional (successful) core competencies. In effect, technology-driven firms like Polaroid (and Kodak as well) would be better served if they also became driven to defining the new business model(s) that need to be implemented in order to help propagate their disruptive-innovation based product lines (e.g., like Apple currently does).

II).  Also being a technology-driven company, Apple too is driven by the technological challenges presented by the computer and electronics industries. But unlike Polaroid, Apple is also driven by defining the new business models that need to be adopted in order to help propagate its "disruptive innovation" types of products.  Apple's iTunes has basically turned the recording (or record) industry "on its head" in that consumers can now purchase and download individual songs at home instead of having to pay for entire albums of songs bundled onto CDs at record stores.  As a result of this "disruptive" business model, record stores are now a thing of the past, as iTunes has revolutionized the music industry at the retail level due to its lower costs, increased conveniences, and more desirable product selection changes.  In response, some retail record stores have moved over to the movie/DVD side of the industry, but it is just a matter of time before this extended business model meets its demise too due to the arrival of even more disruptive innovations in the movie industry.  These innovations will primarily be based on new delivery technologies (e.g., better internet streaming methods, et al.).

Interested LinkedIn members are invited to join the "Disruptive Technologies" professional group (URL below):

http://www.linkedin.com/groups?about=&gid=1027037&trk=anet_ug_grppro

Will Apple’s ‘Disruptive Innovation’ Product Strategy Continue On?

The following is a very insightful 'Disruptive Innovation' genre quote regarding Apple from the 24/7 Wall St. blog article 'Apple Clobbers the Competition: The Carnage of Apple's Spectacular Success' (by Ashley Allen & Douglas McIntyre, 2010).  http://www.marketwatch.com/story/apples-destructive-power-2010-09-24?pagenumber=2

“Apple is an anomaly.  It has the disruptive force of a startup and the consistent message of a mature company.  It is one of the largest tech companies, and yet it is a darling of Wall Street.  It is a hardware company that is also software company, content company and now consumer-electronics company.  It has manufactured not one but three revolutionary consumer products back-to-back, and all on a content distribution model that seems to evolve with the needs of the product.  It’s a killer because it continues to be the first to market and often times the only game in town”.

Being a technology-driven company, Apple is driven by the technological challenges presented by the computer and electronics industry sectors.  But Apple is also driven by defining the new business models that need to be adopted in order to help propagate its disruptive-innovation based products.  Apple's iTunes has basically turned the recording (or record) industry on its head in that consumers can now purchase and download individual songs at home instead of having to pay for entire albums of songs bundled onto CDs at record stores.  As a result of this disruptive business model, record stores are now a thing of the past, as iTunes has revolutionized the music industry at the retail level due to its lower costs, increased conveniences, and more desirable product selection changes.  In response, many retail record stores then moved over to the movie/DVD side of the industry, but this extended business model has now met its demise as well due to the arrival of even more disruptive innovations in the movie industry per new online delivery technologies (e.g., better internet streaming methods immediately come to mind).

I'd now like to expound on the premise that Apple has been a real master at disrupting the environmental scanning attempts of its competitors via a "sleight of hand" (or misdirection) strategy favored by the late Steve Jobs.  This strategy has always caused much consternation among Apple's competitors and industry analysts alike in their attempts to interpret and follow the product-line direction that Apple (as industry leader) is heading.  Apple is also willing to cannibalize its own existing product lines as part of this misdirection strategy, which is disruptive from the standpoint that most competitors find themselves unable to continue following Apple's lead because they cannot cost-effectively cannibalize their own product lines while following and competing with the industry leader.  In effect, this "follow-the-leader" strategy becomes cost-prohibitive for Apple's more cash-strapped competitors, who then either phase down or abandon entirely the particular product line with which they are competing for market share (e.g., HP's tablet product line immediately comes to mind).  Apple's huge cash position basically enables it to "toy" with its product-line competitors and weed out those who cannot afford to stay in the game with them (i.e., just about everyone). But I'm sure that the remaining competitors are now reassessing (i.e., retooling) their competing product-line strategies in the wake of Steve Jobs' unfortunate demise, as Apple may no longer be able to disruptively innovate without him.

NOTE:  Interested LinkedIn members are invited to join LinkedIn DT Group 'Disruptive Technologies'  http://www.linkedin.com/groups?about=&gid=1027037&trk=anet_ug_grppro .

Does the Predictive Analytics (BI) Field Represent a Potentially Major Disruptive Innovation?

Does the burgeoning field of Predictive Analytics (PA), and the competitive advantage that it can potentially generate, represent a major disruptive technology of the near future?  Or will it go by the wayside (i.e., in terms of being disruptive) in the same manner that Artificial Intelligence (AI) computer technology and applications did in the 1980s and 90s?

Several big-name companies like IBM and SAIC are making serious runs at developing PA-based software applications as major parts of their business models.  IBM is pulling out all the stops here, as it recently bought out Cognos and SPSS for their PA assets and high levels of expertise in order to build up the new PA consulting model for its Global Business Services consulting group.  In addition, engineering firm SAIC has recently announced a major PA application system that is now market ready (i.e., SAIC plans to market this new PA application on a worldwide basis alongside its hardware and consulting services).

To elaborate, SAIC's new "Distribution Monitoring System" is designed to proactively predict the occurrence of failures in distribution and transmission systems days, weeks, or even months before they occur.  A "complex-event" processing engine evaluates masses of data (i.e., data mining) against rules laying out the relationship between specific, fairly obvious events in the life of a particular device and the likelihood that such a device will fail (and when it will fail).  Millions of records containing event-driven data can be run either daily or in real time against these rules of thumb (or indicators), which have been designed to identify potential failure points and the timing of their occurrence.  This process then utilizes a knowledge database that can correlate faults and failures (i.e., it learns to proactively detect problems that can cause failures).  A neural network is then utilized to determine whether a failure will occur at some point and to assess, with a stated probability, when the failure will occur.
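
The following toy Python sketch is in no way SAIC's actual engine; the event types, rule weights, and threshold are all hypothetical.  It simply illustrates the general pattern described above: event records for a device are scored against rule-of-thumb indicators, and the accumulated score is mapped to a rough failure probability.

# Toy rule-based failure scoring in the spirit of the predictive-analytics idea
# described above. Event names, weights, and thresholds are entirely hypothetical.
from collections import Counter
from math import exp

# Rule-of-thumb indicators: how strongly each observed event type suggests trouble.
RULE_WEIGHTS = {
    "overcurrent_alarm": 3.0,
    "temperature_spike": 2.0,
    "voltage_sag": 1.0,
    "routine_poll": 0.0,
}

def failure_score(events):
    """Sum the weights of all events recorded for a device."""
    counts = Counter(e["type"] for e in events)
    return sum(RULE_WEIGHTS.get(event_type, 0.0) * n for event_type, n in counts.items())

def failure_probability(score, threshold=10.0):
    """Map the raw score onto a crude 0-1 probability (logistic squashing)."""
    return 1.0 / (1.0 + exp(-(score - threshold) / 2.0))

device_events = [
    {"device": "XFMR-042", "type": "temperature_spike"},
    {"device": "XFMR-042", "type": "overcurrent_alarm"},
    {"device": "XFMR-042", "type": "overcurrent_alarm"},
    {"device": "XFMR-042", "type": "voltage_sag"},
]

score = failure_score(device_events)
print(f"score={score:.1f}, estimated failure probability={failure_probability(score):.2f}")

A production system would of course learn the weights and timing estimates from historical fault data (the roles played by the knowledge database and neural network in the description above) rather than hard-coding them.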

In support of the above premise that PA could become a major disruptive technology in the near future, a recent study by McKinsey Consulting Group suggests that corporations are going to have to embrace the disruptive technologies that will shape the new economic terrain evolving out of the latest global economic downturn.  As economies around the world emerge from the recent downturn, many companies are starting to grasp that what follows most likely won't be just another typical turn of the business cycle.  The resulting new economic terrain will undoubtedly be shaped by persistent uncertainty, tighter credit, lower consumer spending, greater consumer saving, and more pronounced government involvement in business (i.e., McKinsey terms this the "new normal").  The use of powerful PA and Business Intelligence (BI) technologies may be the difference maker for companies in terms of reducing that persistent uncertainty and, as a result, being better able to proactively address potentially serious problems before they become detrimental to the bottom line (i.e., a distinct competitive advantage).  The premise here is that those organizations that don't invest heavily (or effectively) in PA and BI technologies may be left behind the "eight ball" in the currently evolving new economic order.

So the key question here is, does the burgeoning field of Predictive Analytics and the competitive advantage that it can potentially generate represent a major disruptive technology of the near future (i.e., in terms of dominant business models and profitability)?  Or does it represent just another trendy (and costly) “fad” that will go by the wayside without much impact?  My bet is on the former.

Addendum:  One factor to consider in the disruptive technology genre is the impact of the new technology (or innovations) on the existing business model.  It seems as if many disruptive innovations are really not "disruptive" in terms of the technological challenges presented, but are disruptive from the standpoint of the resulting business model challenges that don't get managed properly.  Polaroid's handling of digital imaging technology when it was new is a very good example of this (see Case below).  It lends credence to the premise that promising new technologies can end up falling through the cracks due to the failure of their supporting business models (and companies).  As a result, the necessary business model changes also need to be considered and implemented whenever a new disruptive technology is being implemented in order for it to be successful.

 Case -> Comparing Polaroid’s Film-Based Business Model with Apple’s iTunes Model –

I).  Being a technology-driven company, Polaroid was all about the technological "challenges" presented by the instant photo processing industry at the expense of the marketing challenges involved (resulting in Polaroid's eventual bankruptcy filing).  When digital imaging came on the scene, Polaroid was able to deal with it from a technological standpoint, but it could not change its existing film-based business model based on Polaroid film sales over to one based on digital imaging/processing (i.e., no film involved).  As a result, the arrival of digital imaging technology served as a very disruptive innovation from a business model standpoint for Polaroid, as it went from being very profitable to "collapsing" revenues in a short period of time.  This was evidently due to Polaroid's propensity to view the new digital imaging technology as a technological challenge only, while ignoring the business model challenges presented by the technological change.  The key point here is that disruptive innovations are not primarily technological challenges, but actually business model challenges, if not managed competently.

Footnote:   In Polaroid’s defense, there are numerous hurdles involved in restructuring business models, which include  a). re-educating employees;  b). initial lower profitability;  c). current product cannibalization;  d). increased  management/stakeholder/customer conflicts;  e). complex organizational changes (including culture, etc.);  and f). conflict with traditional (i.e., successful) core competencies. 

II).  Also being a technology-driven company, Apple too is driven by the technological challenges presented by the computer and electronics industries. But unlike Polaroid, Apple is also driven by defining the new business models that need to be adopted in order to help propagate its "disruptive innovation" types of products.  Apple's iTunes has basically turned the recording (or record) industry "on its head" in that consumers can now purchase and download individual songs at home instead of having to pay for entire albums of songs bundled onto CDs at record stores.  As a result of this "disruptive" business model, record stores are now a thing of the past, as iTunes has revolutionized the music industry at the retail level due to its lower costs, increased conveniences, and more desirable product selection changes.  In response, some retail record stores have moved over to the movie/DVD side of the industry, but it is just a matter of time before this extended business model meets its demise too due to the arrival of even more disruptive innovations in the movie industry.  These innovations will primarily be based on new delivery technologies (e.g., better internet streaming methods, et al.).

NOTE:  Interested LinkedIn members having disruptive technology interests are invited to join LinkedIn Group "Disruptive Technologies" http://www.linkedin.com/groups?about=&gid=1027037&trk=anet_ug_grppro .

Tech Professional Certification: ICCP’s CCP vs. PMI’s PMP Debate.

For starters, let me state that I do value my Certified Computing Professional (CCP) certification and display it proudly (along with the certificate number) on my resume and LinkedIn profile.  Originally my certification was the Certified Data Processor (CDP) designation, which I obtained in 1990, with specializations in Core IT Skills, Systems Development, and Management.  These specialties complement my MBA degree, which I obtained in 1989 from the University of Houston with concentrations in Management Information Systems and Management (and now Finance as well).  In addition, it was an MIS professor at UH who recommended that we obtain the CDP certification to complement our MBA (in MIS) degrees.  The CDP certificate then became the CCP at some point during the 1990s due to the restructuring/reorganization that occurred within the Institute for Certification of Computer Professionals (ICCP) at the time.  Anyway, I believe that the name recognition and status that was then associated with the CDP designation went out the door somewhat with the ICCP-mandated name change.

As far as the Project Management Professional (PMP) certification is concerned, it seems as if many of the IT team lead and/or project management job requirements being published these days either require the PMP or state that it is a desired credential.  This is why I have stated in previous discussions that the PMP appears to be the credential in vogue these days.  Again, this is primarily due to its claimed recognition of demonstrated knowledge and skills in leading/directing project teams and delivering project results within the defined constraints of schedules, budgets and/or resources.  These skills are what technical project management types are distinctly looking for these days, so perhaps the ICCP should tailor the CCP exams to become more like the PMP, but with a strict adherence to IT applications (i.e., not the generic things that could apply to any technical/scientific project like engineering, etc).

In addition, I firmly believe that the Project Management Institute (PMI) does a real good job of promoting and/or marketing the value-added properties of the PMP certificate to companies and consulting firms (e.g., Accenture, IBM, et al).  And a big part of the issue here has to do with the perceptions (i.e., image) that are out there.  As a result, maybe the ICCP needs to gear up and study the techniques being used by the PMI in order to see what could be implemented to promote the same value added properties of the CCP certificate to IT organizations.  Perhaps the CCP should be marketed as being a certificate that proves to IT (and user) organizations that the holder has obtained the requisite proficiencies in both the basic IT technologies and project management areas.

In terms of comparisons, the highly regarded Certified Public Accountant (CPA) and Chartered Financial Analyst (CFA) designations both command a lot of respect within professional circles due to their being required certifications in order to practice and advance within the applicable professions.  In addition, they are both difficult to obtain and take countless hours (and dollars) of dedication and hard work just to get prepared to take each level of the multi-level exams.  There are always a large number of candidates willing to go through all of the "hoops" to obtain these certifications due to the high level of professional status that they both afford and because the top-tier (i.e., high-paying) firms require them as basic credentials in order to become employed and move up the ladder to more lucrative positions.  If these requirements were not in place, then no one would be willing to go through the time and expense involved to obtain these two certifications.  Therein lies the problem for the ICCP -> because the CCP is currently not a required credential within the IT project management world, it can never become the demanding, highly statured designation that the CPA and CFA have become. The PMI folks seem to recognize this issue and are doing a good job of promoting (or marketing) the PMP certificate as a required credential for IT and other types of technical project management openings.  In addition, one other factor that lends credibility to the PMP certificate is that the PMI requires a minimum number of years in a leadership position (e.g., 4 years of project management work) on top of passing the difficult exams in order to get chartered as a PMP.

One caveat to the importance of certifications is that a lot of them seem to come and go based on changes in technology and business models.  I remember when the Microsoft certifications were hot items for a few years (in the 1990s), to the point where MS had convinced IT professionals and companies that they were a necessary credential in order to work on MS products.  But this is no longer the case, as the existing products have evolved and other companies' products have eclipsed some of Microsoft's products in the industry, thus rendering the old MS certifications pretty much useless.  The CCP and PMP should not have this particular problem, but it just seems as if PMI is doing an excellent job of promoting the PMP as the certification of choice in the IT world and is convincing a lot of companies that it needs to be a required certification (i.e., much like the CPA and CFA).

In conclusion, the ICCP has a lot of catching up to do if it wants to keep the CCP designation from being entirely eclipsed by the PMP designation over the next few years as a lot of companies seem to be getting on board with the PMI’s mantra.  As a result, I firmly believe that the ICCP needs to study the successful marketing techniques being employed by the PMI in order to emulate them for use in promoting the CCP certificate and bringing it up to par with the PMP in terms of name (or brand) recognition within the IT project management world.  One advantage that should be publicized by the ICCP is that one must pass all ICCP exams with a score of 70% or higher in order to get certified as a CCP; whereas one only needs to pass the PMI exams with a score of 60% or higher in order to get certified as a PMP.  In addition, the ICCP exams are applicable strictly to the IT sector of technology, whereas the PMI exams are not really IT specific in scope as they seem to try to encompass all types of project management sectors (e.g., engineering, etc).  So there are several areas of advantage that the ICCP has over the PMI in the IT certification sector that need to be exploited and publicized by the ICCP in order for it to become a more recognized player in the industry.

NOTE:  Interested LinkedIn members having IT/technological interests are invited to join LinkedIn Group "Disruptive Technologies" at http://www.linkedin.com/groups?about=&gid=1027037&trk=anet_ug_grppro .

Administration Vows to Get Tough with China?

Due to the large amount of U.S. government debt currently being held by China, we are pretty much at its mercy and remain beholden to it as one of our largest (i.e., most manipulative) debt holders.  This is not a good position to be in!  We can make demands and issue hollow threats all we want, but when we need to borrow more money or refinance the huge debt of ours that China currently holds, we end up having to acquiesce to doing things their way again (i.e., the same old broken record).  Insisting that China should implement the same types of labor laws, product pricing mechanisms, and currency regulations that put us in an uncompetitive position in the global economy is laughable when they are the major holder (i.e., manipulator) of the U.S. government's debt.  The U.S. dollar itself is also a potential hostage here, as China's large $2.45 trillion collection of foreign exchange reserves includes a tranche of U.S. dollars that makes up approximately 67 percent of the total (i.e., roughly $1.6 trillion), giving China considerable leverage over the USD's value should it decide to make significant "policy" changes.

So, per the old saying "Those who pay ($) have the say", China is definitely in the driver's seat in terms of driving the global economy; they are definitely beating us at our own game.  I assume that Richard Nixon and Henry Kissinger never foresaw this in their wildest dreams when they implored Maoist China to open up to the West in the early 1970s.  The recent paradigm shift over there has been amazing; I just keep waiting for their oversized (and state-induced) bubble to pop.  If and when it does, then watch out, rest of the world, as the resulting collateral economic damage will be widespread!

Written by Larry Fry, CCP, MBA

September 16, 2010 at 1:24 pm