System Centering Solutions' Press and Articles

The Voice of Data Quality: Neither an Echo nor a Form of Confirmation Bias

December 2, 2016

As published on www.dataversity.net on December 2, 2016. In the aftermath of the U.S. presidential election, and amidst cries about “the day data died,” it is fitting to respond to the purported demise of data and questions about the value of this subject in general. Let me, then, state at the outset that data are alive and well: It is the interpretation of data – selective by many and prejudicial by many more – that makes it seem that this material is irrelevant; that it has no voice, so to speak, except the one we choose (often erroneously) to give it; that the numbers are meaningless because, as Donald Trump’s victory over Hillary Clinton allegedly demonstrates, we should not trust this or any other kind of data. In point of fact, the election should put an end to confirmation bias, not a stop to data. For the latter has a voice – it is the signal that separates itself from the noise – and it must be our responsibility not to mistake that sound for something it...

Can Data Science and Big Data Improve Design?

October 21, 2016

As published on www.dataversity.net on October 21, 2016. A question for Designers and Data Scientists alike: Can members of the latter empower representatives of the former? Which is to say, can design – a discipline dependent on the artistic ability and the qualitative skills of a given person – become better and more effective because of the quantitative knowledge of a specific group of experts? Can, in other words, Big Data improve design and create a greater emotional response among consumers? The answer is: Yes. Big Data can reveal certain preferences, and confirm the numbers behind those preferences, involving why people like sites that have, say, a particular aesthetic and a distinctive layout. While that information will not transform you into an artist, and though that material will not bless you with an intuitive eye for how to draw, sketch or paint, it will make an already talented Designer a more effective user of this digital domain of creativity. For we now have the...

Democratizing Big Data: Empowering Entrepreneurs Worldwide

October 12, 2016

As published on www.foundersguide.com on October 12, 2016. Democratizing Big Data, making it more accessible and affordable, is the key to transforming the way entrepreneurs do everything from overseeing website development and design to the manner by which they communicate with current and potential consumers. This shift, local in its ability to help entrepreneurs more accurately reach their respective communities and global in its implications for businesses of every size and interest, represents a permanent shift towards the creative executive – the enterprising individual – who can leverage this information for the good of his company and the benefit of the public as a whole. Translation: Since this material is no longer the exclusive province of large corporations, and since independent experts can apply this data for entrepreneurs worldwide, every startup now has an advantage of incalculable value and immeasurable convenience. On a practical level, this...

Big Data: The Key to Big and Successful Marketing

August 22, 2016

As published on www.dataversity.net on August 22, 2016. Some words about Big Data: It is irrelevant without big marketing, the kind of outreach responsible for connecting with consumers, resonating with listeners or viewers, and echoing throughout the auditorium or arena for an audience of interested attendees. For Big Data is like a map that an increasing number of experts can read, which for the purposes of this metaphor means the map itself (never mind the mapmaker) costs less money. When interpreting the information is more affordable than ever, and when access to that material is no longer the sole province of big corporations, the onus shifts to interpreting said content. Hence the rise of big marketing, which is another way of saying smart – and targeted – outreach to relevant buyers, shoppers, customers and potential leads. So, while the analytical aspect of this job is no longer expensive, we should turn our attention toward the difference between conventional...

Converting Big Data Into Conversational Gold

July 25, 2016

As published on www.dataversity.net on July 25, 2016. If data is a form of language – if all those ones and zeroes constitute a way of communication – then translating those figures into something intelligible for a mass audience should be the end result of this phenomenon we call Big Data. It should be the culmination – no, the conversion – of a series of commands into a chance for discussion among the very people who represent the numbers flashing on hundreds of millions of screens, and causing the blades on those fans – in those server farms – to spin in seeming perpetuity. I give you, dear reader, the rise of conversational marketing, thanks to more affordable access to data. Does this, then, mean that we have some sort of alphanumeric way of helping companies reach the right consumers, at the right times, for the right reasons? In not so many words, yes; we have the experts to dissect the analytics, and we have the economies of scale to make what was once the...

How Data and Conversational Marketing Help Social Media’s Voice

June 24, 2016

As published on www.socialmarketingfella.com on June 24, 2016. Fact: Big Data is nothing more than an empty term, a euphemism among the technorati, unless it produces the sort of “conversational marketing” necessary for social media; unless it yields superior content and inspires a dialogue among consumers; unless it showcases messaging that resonates because of the quality of its content and the eloquence of its voice; unless, in short, it is authoritative and authentic, believable because of the caliber of its story and the sincerity with which it tells its tale. The good news is that data contain the information necessary to achieve that goal. The even better news is that data can strengthen everyone’s ability to communicate, transforming social media into a destination worth visiting and a site (or a series of sites) worth reading. That means the style of writing, on and throughout the various sources of social media, should also rise. Rather, there will be a greater...

The Triumph of #BigData Relevance in the Social Media Revolution

June 8, 2016

As published on www.socialmarketingfella.com on June 8, 2016. The triumph of Big Data may take many forms, but it already has an impact – a large one – within the realm of social media. By having entrée to key analytics, and by making that information affordable and accessible to as diverse an audience as possible, companies and consumers can have a more insightful – and sincere – conversation about their respective brands, and products and services. Gone is the blather that pollutes too much of social media, thereby compelling businesses to use data to their advantage. Gone, too, is the meaningless quest for likes and followers, the worthless expense of purchasing bogus accounts and fake profiles. Gone, in short, is the way companies use social media, as if a series of sentence fragments and links constitutes a dialogue or a meaningful exchange of ideas. Perhaps the biggest change involving this situation is cost: If an entrepreneur can retrieve the same degree of...

Big Data Meets the Hospitality Industry: A Revolution in Marketing and Communications

April 8, 2016

As published on www.dataversity.net on April 8, 2016. Perhaps the industry most awash in data is the one few people would imagine as so dependent on interpreting – and converting – that collection of ones and zeroes into actionable intelligence. I refer to the hospitality industry – including hoteliers and their respective executives – that will use Big Data to transform the way various properties market to business and leisure travelers. This revolution is, in fact, a twofold phenomenon: On the one hand, there is the sheer amount of data that is now available, including information about individual travelers and corresponding groups worldwide; while, on the other, the cost of analyzing and dissecting that data is now accessible to all because it is affordable for all. I write these words from experience, since I am a strong advocate of the democratization of data. For it is that very event – the freedom to have professionals provide, translate and deploy...

The Big Data Revolution Within the Hospitality Industry

March 29, 2016

As published on www.hotelexecutive.com on March 29, 2016. Big Data will provide a complete "digital profile" of current and prospective guests, enabling hotel executives to create more effective marketing and communications campaigns. This opportunity, available for all and affordable to all, will transform the way hoteliers interact with travelers; it will revolutionize this relationship for the better by making outreach more direct, personal and relevant. Thus, these benefits are too important to ignore - they are too substantial to dismiss - since the result will be a more intimate and gracious expression of loyalty from hotel executives on behalf of their most loyal supporters. Welcome to the big dividends of Big Data. Attention, hotel executives: Every morsel of information about every guest - past, present and future - already exists; it is available for you to analyze, scrutinize, read, review and examine; it is decipherable, thanks to a revolution in technology and a...

Mining Data and Making Social Media More Sociable

January 26, 2016

As published on www.hotelexecutive.com on January 26, 2016. Data and social media are the twin forces of change within the hospitality industry. Maximizing the power of these tools - for the good of workers and guests alike - is (or must be) a hotel executive's principal responsibility. Data and social media are the currency of the Web, with their respective collection and conversion of so many ones and zeroes into words and actionable intelligence. They are also the indispensable ingredients of the hospitality industry. Whatever label we assign to the first of these two forces, whether we speak of Big Data or information in general, one thing is certain: Access to that material is no longer the exclusive province of the world's top hoteliers and premier digital marketers - and that is very good news for travelers and hotel executives alike. By democratizing data, by automating this process (for convenience) and streamlining this concept (for greater affordability), hotel...

Democratizing Data and #SocialMedia: Mass Meets Class

December 21, 2015

As published on www.socialmarketingfella.com on December 21, 2015. The effective use of social media, as well as access to Big Data, should not be – it shall no longer be – the exclusive province of global corporations with seemingly endless marketing budgets and a near-constant presence on every relevant online platform and feed. For there is no reason why entrepreneurs and medium-sized companies, from individuals with a passion for technology to organizations with a commitment to engaging consumers, should face the false choice of mass versus class; which is to say, there is no bar that prevents a business from reaching a substantial audience with a customized message that is as elegant as it is eloquent. Indeed, the latest chapter in the ongoing story of marketing and communications is, perhaps, the most important section of all: The democratization of data and affordable access to experts fluent in the language of the Web; professionals who forgo the generic for the...

Democratizing Big Data: Expanding Knowledge and Empowering Entrepreneurs

December 7, 2015

As published on www.dataversity.net on December 7, 2015. For all the talk about Big Data, and despite the multitude of conflicting definitions of this term, there is nonetheless a movement underway to “democratize” this issue; to automate and analyze information for small to medium-sized companies, thereby enabling these businesses to be more responsive to the needs of consumers and the wants of would-be customers worldwide. This milestone is important because it levels the proverbial playing field between global corporations, with whole departments and experts dedicated to translating the language of the Web into customized messaging, and businesses with fewer of the resources – and far fewer of the dollars – necessary to otherwise stay competitive in this contest for the attention (and money) of consumers in the Americas, Europe, Asia and elsewhere. I write these words from experience: in my role as Founder of Ocoos.com, I seek to make data more affordable...

Dr. Rahul Razdan, founder and CEO of Ocoos, Ocala, Fla.

September 28, 2015

As published on www.business-superstar.com on September 28, 2015. Dr. Rahul Razdan is no stranger to the business world: He holds a PhD in computer science from Harvard, is named on 24 issued patents and has over 20 years of executive management experience in a variety of roles in sales, R&D, and marketing. His latest endeavor is Ocoos, a cloud-based platform that enables business owners to build a world-class marketing solution for their distinctive interests. Q: What inspired you – and who encouraged you – to become an entrepreneur? Dr. Rahul Razdan: As the founder of Ocoos, which provides a variety of high-quality business-to-business services to mid-size companies worldwide, my entrepreneurial drive – this union, to borrow the words of Steve Jobs, of technology married with liberal arts, married with humanities – enables me to combine my technical skills as a computer scientist with my enterprising spirit for launching (and building) companies of exceptional quality and...

Why Outsourcing Social Media Is a Task for Professionals, Not Professional Interns

August 24, 2015

As published on www.rescue.ceoblognation.com on August 24, 2015. Companies can and do outsource a great many things. That comment is neither a criticism nor an expression of sorrow; it is, instead, a statement of fact. Businesses outsource projects like web design and copywriting, as well as management of real-time traffic and the creation of intelligible reports about Web analytics. The quality associated with outsourcing is, in the end, no different than the caliber (and accountability) of the work assigned to an in-house employee, meaning: If you have the right professional, someone who honors the most expansive definition of that word, technology shrinks or eliminates distance; it broadens (and often improves) the pool of talent, without regard to the physical limitations of recruiting candidates within a particular state or city. Where outsourcing will make or break a brand, where it has the power to attract friends and followers (or alienate customers and repel...

Ocoos, Your World-Class Website Solution for Services and Complex Products

August 12, 2015

As published on www.epodcastnetwork.com on August 12, 2015. Podcast: Play in new window | Download (Duration: 10:57 — 7.5MB)
Dr. Rahul Razdan, the Founder and CEO of Ocoos, your world-class website solution for services and complex products, joins Enterprise Radio. Dr. Razdan has authored numerous technical papers and is named on 24 issued patents.
Listen to host Eric Dye & guest Dr. Rahul Razdan discuss the following:
For the benefit of our listeners, please give us a summary about Ocoos.
How do the services available from Ocoos level the proverbial playing field between entrepreneurs and larger corporations?
Since Ocoos offers a top-to-bottom series of tools and resources, how does this consistency of quality help the end-user of, say, a startup or a small business?
Please give us an example of some practical successes involving Ocoos.
Dr. Rahul Razdan has over 20 years of executive management experience...

Using the power of the network to amplify your brand and online sales

August 9, 2015

Enterprises in various industries engage with networks of small businesses as distributors, suppliers, or customers. In general, independent small businesses have limited expertise and capability in sales/marketing. According to Google, 50% of these businesses do not have websites, and 90% of them do not have websites optimized for mobile platforms. This is all in the context of 90% of consumers going to the internet as their primary method of research, and increasingly doing so on mobile platforms. The situation is so extreme that Google has announced that it will penalize non-mobile-optimized websites in its search rankings [mobilegeddon]. In this article, we will show that by meaningfully engaging with their network on the topic of online infrastructure, large brands can create a keiretsu effect which can amplify their brand, drive incremental sales, and strengthen their network. Network Marketing for Distributors: Enterprises such as insurance companies, financial...

Quick, Smart, and Responsive

July 7, 2015

As published on www.qsrmagazine.com on July 8, 2015. A democracy of data for the quick-service restaurant industry. The quick-service restaurant industry is built on the delivery of food—and data. The former is obvious, complemented by the physical layout and the queue of customers, standing or seated, waiting, respectively, to place their orders or receive their meals. But the latter is the proverbial secret sauce, the (data) packet of ingredients that, when arranged and made accessible to a business owner, reveals valuable information about the interests, habits, and expectations of consumers. Translating that material into actionable intelligence, converting the language of the Internet with its collection of so many ones and zeroes into a series of facts, is the essence of data. Whether we label that content “Big Data,” which is a media catchall for anything and everything that bears this stamp of importance, the point is this: Deciphering data, finding the details that...

Empowered by Data and Inspired by Words: Successful Use of Social Media

June 2, 2015

As published on www.thesocialmediamonthly.com on June 2, 2015. There is an online collision between data and deeds, between what we can now learn and what we continue to say and do within the world of social media. Think of the former as the savior of the latter, where, for reasons of pride and conjecture, and a sort of convenient deafness to the rhythms and cadences of communications (among the vernacular of Twitter, the long-form writing of blogging, and the conversational atmosphere of Facebook), too many companies miss the chance to enjoy the benefits of social media. This problem is neither too difficult to solve nor too costly to answer, because there is a revolution afoot; it is as real as any other milestone in the history of the Web, and as permanent as any shift in the relationship between companies and their respective consumers. I refer to the revolution of access to Big Data; the once-exclusive domain of global corporations with seemingly unlimited resources,...

Hospitable by Nature and Intelligent by Design: Technology Solutions for Hotel Executives

April 13, 2015

As published on www.hotelexecutive.com on April 13, 2015. Hotel executives face a challenge and an opportunity, both of which have their roots in the disruptive power of technology. At one extreme, there are the makeshift, room-for-rent social entrepreneurs - the men and women who offer overnight accommodations (courtesy of a spare bed, couch, futon or floor) to travelers in a major city - while at the other end of the spectrum there are conventional hotels and resorts. Among the latter - conventional hoteliers and high-end, five-star brands - despite their more spacious and inviting arrangements, including housekeeping, room service, plush decorations, magnificent views and in-room entertainment, there is a keen need to more effectively reach potential guests and adapt to this hyper-competitive environment. The only way to achieve that goal and increase occupancy rates, without succumbing to downward pressure from less traditional players in this space, centers...

Design and Data: The Essential Elements for Online Success

April 7, 2015

As published on www.rescue.ceoblognation.com on April 8, 2015. Of all the elements every business owner should have and every industry should possess, few are more essential than design and data. Or: If a company is to succeed online – if it seeks to make an impression with style and substance, through a combination of intelligence and analysis – then it must have a distinctive website, one which gathers and mines data about current and prospective clients. To ignore or dismiss the importance of the former is to squander the benefits of the latter; it is to divorce style from substance when, in fact, the two are inseparable. That is, a company cannot have a generic website (and by “generic,” I believe most people know it when they see it, with the unmistakable tabs, quartet of colors, boring text, and graphs and pie charts) because the lack of an online identity – the absence of personality – will fail to attract new visitors, and quickly try the patience...

Method, apparatus, and computer program product for facilitating marketing between businesses

April 2, 2015

A method, apparatus and computer program product are provided for facilitating a marketing interlock between businesses. Two or more businesses may enter into a co-marketing campaign in which a sponsoring entity funds an advertisement of a sponsored entity on a third party advertising system, such as a search engine. Marketing content of the sponsoring entity is inserted on the website of the sponsored entity. Traffic originating on the third party advertising system may therefore first be driven to the sponsored entity's website, and subsequently to the sponsoring entity's website, thereby achieving a mutually beneficial co-marketing relationship. Marketing relationships among complementary businesses based on provided service and location may also be facilitated according to the methods provided (Full Patent Here).
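The traffic flow the abstract describes (paid ad → sponsored entity's website → onward to the sponsoring entity's website) can be pictured with a toy model. This is purely an illustrative sketch: the class, function, names, and figures below are invented for exposition and are not part of the patent.

```python
# Hypothetical model of the co-marketing interlock: a sponsor funds an ad
# that lands on the sponsored site; the sponsor's content embedded there
# then drives a fraction of that traffic onward to the sponsor's own site.
from dataclasses import dataclass

@dataclass
class Business:
    name: str
    visits: int = 0

def run_campaign(sponsor: Business, sponsored: Business,
                 ad_clicks: int, click_through_rate: float) -> None:
    """Paid traffic is driven to the sponsored site first; a share of it
    follows the sponsor's inserted marketing content to the sponsor."""
    sponsored.visits += ad_clicks                          # ad lands here
    sponsor.visits += int(ad_clicks * click_through_rate)  # banner click-through

insurer = Business("sponsoring-entity")   # funds the advertisement
agent = Business("sponsored-entity")      # hosts the sponsor's content
run_campaign(insurer, agent, ad_clicks=1000, click_through_rate=0.05)
print(agent.visits, insurer.visits)  # prints: 1000 50
```

Both sides gain in this arrangement: the sponsored entity receives funded traffic it could not afford on its own, while the sponsor converts a share of that traffic downstream.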

The Real Estate Agent With The Technology to Succeed: The 'Democratization of Data'

March 8, 2015

As published on www.realtytimes.com on March 8, 2015. Real estate professionals are more than experts about the properties they represent and the clients they advise. They are also champions of technology, advocates for new tools and resources to elevate their online visibility and more effectively market to prospective buyers and sellers. To do that job well -- to have a distinctive website, one with its own high-quality design and easy navigability, which simultaneously captures data about individual clients -- traditionally requires a substantial investment in multiple developers, programmers, analysts and account executives. For it is that very job, with its emphasis on personalization (each site should have its own identity and messaging) and scientific marketing (every professional should lessen the scattershot approach to communications, which is costly and hard to evaluate), which promises to transform the real estate industry into a more targeted and successful...

An Integrated Platform for Health and Wellness: The Next Generation of Websites

March 3, 2015

As published on www.corporatewellnessmagazine.com on March 3, 2015. Corporate wellness and technology are increasingly inseparable: The latter enhances the former by elevating a doctor’s presence online (to cite one notable example) and educating prospective patients about relevant news, trends and studies within a particular area of the health care industry. Accomplishing that goal should not require a physician to master various forms of software and coding, nor should it consume valuable hours performing administrative tasks at the expense of helping the sick and empowering the aged. Thankfully, we are in the midst of a renaissance involving the way physicians can easily establish their own respective messaging and outreach to specific individuals. We are at a moment of economic and technical transformation, where the costs for creating, marketing and mining a website – the trio of services responsible for launching a site, customizing its design and features and...

Is There a Way the Internet Can Actually Generate Jobs?

August 14, 2014

Since the great recession of 2008, unemployment has been stubbornly high. However, the level of unemployment diverges greatly between the college educated (under 4%) and the non-college educated (over 10%). Further, for non-college educated youth, the situation is truly dismal. Is there any way to address this situation? At Ocoos, we offer an internet platform for the Small and Medium Business (SMB) marketplace. Much like Facebook for social networking, the Ocoos platform does not require any programming skills to operate a full marketing/sales infrastructure. As we have engaged with the SMB market, we have made the following observations:
A significant percentage of the SMB market consists of businesses run by owners who are uncomfortable with technology. They are uncomfortable even with computers, much less the internet, cloud, or mobile.
As described in (Small Service Providers and the Digital Age), these businesses recognize the need to engage with the internet, but are not equipped...

Marketing Services Agents: Is an Insurance Agent Model the best way to get small businesses onto the Internet?

August 6, 2014

Large enterprises with significant marketing budgets can afford to pull together a coherent marketing plan which coordinates branding, promotion, and advertising. They do so internally or by hiring traditional advertising agencies. However, most small and medium-sized entities (SMBs) cannot afford a marketing agency. They are left with three significant issues:
Rising Complexity: The sheer choice of available promotion opportunities is mind-boggling. These include physical marketing materials (business cards, brochures, etc.), media (magazine, radio, TV, newspaper, etc.), outdoor advertising (billboards, signs, etc.), and, most recently, the Internet (website, search, social media, etc.).
Expertise: In most cases, SMB staff do not have the expertise to pull together a coordinated marketing plan which matches their business needs, or access to the mix of resources (analytics, design, creative) needed to execute that plan.
Bandwidth: SMB enterprises are constrained on...

Optimizing Schedules for Service Providers While Making Your Customers Happy!

June 13, 2014

Service providers in professions as disparate as pet groomers, CPAs, or golf pros face the challenge of optimizing their schedules in the face of customer expectations. Part of the difficulty is that various customers value their time very differently. Let’s consider three parties:
Time-Sensitive Customer: A busy business owner or executive who values their time, and is typically willing to pay more for expedited service.
Price-Conscious Customer: Meanwhile, a retiree may well have the luxury of time, but is very sensitive to price.
Service Provider: Typically, a service provider lives in a world where there is either high demand or no demand. The task of optimal scheduling is very difficult, and the consequences of getting it wrong are dire. Further, if the service provider is mobile, all the problems of travel are added to this issue.
Is there a solution? Yes. There is patented research from the University of Florida, implemented in the Ocoos (www.ocoos.com)...
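As a hypothetical sketch of the three-party tension described above (not the patented University of Florida algorithm, which the article does not detail), a simple greedy assignment might give time-sensitive customers the earliest open slots at a premium, while price-conscious customers take the cheapest off-peak slots. All slot data, names, and the 25% premium are invented for illustration.

```python
# Illustrative greedy scheduler: time-sensitive requests pick the earliest
# open slot and pay a premium; price-conscious requests pick the cheapest.
def assign_slots(slots, requests):
    """slots: list of (hour, base_price); requests: list of (name, kind),
    where kind is 'time' (time-sensitive) or 'price' (price-conscious)."""
    by_time = sorted(slots)                       # earliest hour first
    by_price = sorted(slots, key=lambda s: s[1])  # cheapest slot first
    taken, schedule = set(), {}
    for name, kind in requests:
        pool = by_time if kind == "time" else by_price
        for slot in pool:
            if slot not in taken:
                taken.add(slot)
                hour, price = slot
                premium = 1.25 if kind == "time" else 1.0  # assumed markup
                schedule[name] = (hour, round(price * premium, 2))
                break
    return schedule

slots = [(9, 40.0), (11, 30.0), (14, 25.0), (16, 30.0)]
print(assign_slots(slots, [("executive", "time"), ("retiree", "price")]))
# prints: {'executive': (9, 50.0), 'retiree': (14, 25.0)}
```

A real scheduler would also weigh travel time, demand forecasts, and cancellations, which this sketch deliberately ignores.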

The Next Step in Business Referrals

June 13, 2014

Figure 1: Networking of Businesses
Small businesses know the power of networking in order to gain customers. With this in mind, many small businesses participate in a number of activities, including:
Local BNI groups focused on referrals
Local Chambers and various networking events
Business Associations
While effective, these mechanisms rely on physical interaction, which takes time and is expensive. Can the Internet help? Today, one can go onto the Internet and build a web presence. With this web presence, millions of businesses try to get the attention of the search engines. Much like a blade in a field of grass, all businesses start looking the same! In addition, the Internet currently has no way to expose the reputation that small businesses have built over many years, nor the partnerships which have been forged and tested over time.
Figure 2: Website Integration
Ocoos solves this problem with its patented B2B...

What muscles do you need to build to transition from corporate role to a startup?

June 12, 2014

The idea of building your own company is very attractive to corporate executives. This sounds great in theory, but the next question becomes: what kind of startup, exactly? This is where building three types of muscles not found in a corporate environment becomes very important.
Idea Pipeline: In a corporate environment, there is a focus on execution and, to a smaller degree, on innovation around the strategy of the company. However, when evaluating a potential startup, it is very important to expose yourself to an idea pipeline which stimulates your thinking process. Great techniques for gaining access to an idea pipeline include:
Local Research Universities: Connect with the patent licensing office or a research group of your interest. They are looking at the state of the art, and the licensing function provides great access to leading-edge researchers.
Angel Investor Groups or, in rare cases, local venture capitalists: Both groups are looking for knowledgeable people to help...

Making Lemonade out of Lemons: Software startups outside of Silicon Valley

June 12, 2014

Silicon Valley enjoys many advantages which have made it the juggernaut for software startup activity around the world. The significant advantages include:
Access to Talent: Silicon Valley is a magnet for talent. This starts with world-class universities such as Stanford or Berkeley, which attract talent from around the world. In addition, Silicon Valley benefits from "immigration" not only from overseas, but more significantly from other parts of the United States.
Access to Capital: Over the years, Silicon Valley has developed a broad and deep institutional venture capital community. Just as importantly, there is a broad and deep community of angel investors. For most software startups, a small angel round is often the crucial starting point.
Access to Customers and Partners: Silicon Valley is a large population center, so there is easy access to a large local consumer market. In addition, for any software startup looking for partners or commercial customers,...

Is the Internet the solution to non-college unemployment?

June 12, 2014

Since the great recession of 2008, unemployment has been stubbornly high. The level of unemployment diverges greatly between the college educated (under 4%) and the non-college educated (almost 10%). Like many of you, I have always thought about the Internet as a disruptive technology which can bring great efficiency to various markets, but certainly does not help with employment. In fact, one could reasonably argue that platforms such as Amazon or Craigslist have had the opposite effect in their respective industries. However, for the small business segment, the biggest driver of job growth, the story might well be different. Small businesses (over 28 million in the US alone) offer critical products and services, which are valuable for their customers. However, unlike large enterprises, it is exceedingly difficult for small businesses to excel in all the key aspects of their business, including business development, marketing, administrative operations, and customer support....

Technology Insurance for Small Businesses

June 12, 2014

When operating a small business, one has to worry about keeping current with technology to connect effectively with the appropriate target customers. However, the world of technology moves at a very rapid pace. In just a few years, social networking and mobile devices have become critical to the success of any business. Most small businesses built their marketing and sales infrastructure (websites, ecommerce) with no real thought toward technology obsolescence. Keeping up with these technological changes can be expensive in both time and money. The cottage industry of web developers who have traditionally provided solutions has not helped the situation, because they have built one-time custom solutions.   How does one insure against this risk of technological change? A small business owner has always had to deal with insurance - liability, workers' compensation, etc. But never before has someone had to worry about technology insurance.   How does one insure against technology...

Three Reasons You do NOT want to build a WordPress Website!

June 12, 2014

The current trend in the world of website development is toward mass customization through the use of content management toolkits such as WordPress. What do we mean by “mass customization?” Every website is built with a different layout and functionality. This article argues that building websites using this model is very counterproductive for the vast majority of small business firms. In fact, this approach actually impedes the generation of economic value. Why? There are three significant reasons. Maintenance Costs: When a small business firm builds a website, either directly or through web developers, it has implicitly committed to the lifetime costs of maintaining that website. There are a vast number of reasons for updating a website, including a redirection or new opportunity for the business, an update in technology (the introduction of the iPad, for example), or simply a desire to take advantage of new web technology (like LinkedIn). There are many examples of small...

What is the Next Step for Specialty Retailers (Petsmart, Staples, Foot Locker, Office Depot, etc.)?

June 12, 2014

In the last 10 years, specialty retailers have grown rapidly. The list of successful companies includes names such as Staples, Dick’s Sporting Goods, Petsmart, Petco, Sports Authority, Hobby Lobby, etc. Recently, many of these specialty retailers have encountered challenging times due to the disruptive business models coming from the likes of Amazon, Walmart, and Target. In this thought piece, we assert that specialty retailers could significantly benefit from leveraging local SMB service providers as valuable customers and partners. As a customer, the SMB community is a valuable market which can be large in its own right as well as provide thought leadership to the broader customer community. As a partner, the SMB community can provide valuable complementary services which can be “networked” through the specialty retailer (similar to Apple’s iTunes store) to complement its traditional product sales.   With this strategic direction, specialty retailers can capitalize on...

Where are the Solutions for Service Based Businesses?

June 10, 2014

Over the years, innovation has spurred both growth and efficiency in the distribution sector for products. Today, as a product maker, it is easier than ever to market and sell your offerings. The significant innovations for products include mail-order catalogs (Sears), superstores with optimized logistics (Walmart), specialty superstores (Dick’s, Staples, Bass Pro Shops, etc.), and finally the world of ecommerce with Amazon/eBay.   Service providers have seen no similar increase in efficiency. At best, one can say that the Yellow Pages has provided a directory (online and in book form) and review systems (Yelp, Google) have provided a method for lowering the barriers to engagement. However, suppliers of services are still largely in the same position that product makers were decades ago. There is a great need for service-based businesses to get the tools to manage their business as well as network with complementary businesses to build larger, more interesting value...

Wireless Power for Heart Implant

May 20, 2014

Stanford has prototyped the ability to wirelessly charge pacemakers (Click here). Wirelessly charging pacemakers seems like an idea which makes a lot of sense. In fact, at WiPower we explored this area with several medical device makers, and the results on the business side were surprising. These included:

The current battery in a pacemaker lasts nearly as long as the patient.
If the battery needs to be replaced, it is likely there is a new generation of pacemaker which would provide an upgrade as compared to the current one.
The current payment system is focused on procedures, so there is a disincentive to have long-life products.

It appears that the place wireless power can have a large impact is with high-energy devices such as artificial hearts.

Power Plant Graduates Anchor Tenant

November 20, 2013

As published on www.ocalacep.com on November 20, 2013 Ocoos, the first start-up company to enter the Power Plant Business Incubator, will graduate from the facility on November 30. Founded in 2011, Ocoos offers a cloud-based platform which allows small businesses to create a web presence with mobile, e-commerce, CRM, analytics and scheduling capabilities. The company currently employs 10.   Dr. Rahul Razdan, Ocoos CEO, commented, “Our graduation comes at an exciting time in our company’s growth. In the last two years, Ocoos has had over 500 customers in its beta program. Now, we are shifting gears and accelerating business development. The Power Plant Business Incubator has been an important resource for us and we look forward to staying connected as it continues attracting, growing and graduating companies like our own.”   In 10-15 minutes, business owners can build a website on Ocoos, integrate social media and take advantage of a host of powerful tools. This allows...

The Tyranny of Software Margins

September 22, 2013

Software is an amazing product because the cost of “manufacturing” and distribution is essentially zero. With this business model, software companies can be run with incredibly high margins and can be excellent generators of cash. The investment community certainly recognizes these characteristics and rewards software companies with higher multiples in the public equity markets. However, it is also true that many of these same companies have difficulty building new innovative products.   Why? The answer is that high margins have a downside.   For most software companies, one of the highest costs on the P&L is R&D. With efficient markets, the P&L of software companies is optimized at some baseline level of R&D spend. Thus, for any public software company to invest heavily in innovation, it must increase R&D spending, and thus directly impact the P&L in a material manner. Note that manufacturing companies do not have this issue because the costs of R&D are...

Qualcomm’s WiPower Technology Starting to Gain Traction

August 18, 2013

Recently, Integrated Device Technology, Inc. (IDT) and Qualcomm announced a collaboration on the development of an IC for consumer electronics devices based on Qualcomm’s WiPower technology. This news adds to the momentum for Qualcomm's technology in the marketplace. Over the last few months, Qualcomm's technology has been in the news on several occasions:

In June 2013, Qualcomm announced an agreement with Gill Industries to develop the wireless power market for the automotive and furniture segments (Read more)
In May 2013, Samsung announced some devices based on the Qi standard, but also indicated that Qualcomm's technology was the future (Read more)

Overall, the first generation of technologies in wireless power has run its course. These technologies, exemplified by the Qi standard, provided wireless power but with very restrictive spatial constraints. The Qualcomm technology expands the degrees of freedom quite significantly, so it is not surprising to see that technology gain...

World Class Software Development for Technical Computing

February 22, 2013

Commercial software development in the field of technical computing has unique requirements which are exemplified by factors such as:

Extreme focus on run-time performance.
Requirement of a high degree of responsiveness to the customer base.
Continued focus on innovation.
Concurrent support on multiple computing platforms.
Most importantly, a very limited set of deep subject matter experts who have the skills to build the solution.

Exacerbating the above issues is the fact that the field has traditionally seen a great deal of merger activity, which leads quickly to an accumulation of disjoint software development systems. We have published a paper (downloadable from the Document section at this link) that describes a detailed case study of a methodology built within a leading technical computing company which achieved significant success by focusing relentlessly on enhancing the productivity of the individual developer.

The Internet: Is There a Way it Can Actually Generate Jobs?

February 4, 2013

As published on www.prweb.com on February 4, 2013 Over the last 20 years, the Internet has impacted nearly every major industry in the economy. The profound changes that have occurred are well cataloged in Mary Meeker's annual KPCB presentation (Click Here). The impact of the Internet centers on two broad themes:

Efficient connection of buyers to sellers (e.g., Amazon, Netflix, etc.)
Utilization of physical assets through software intelligence (e.g., Airbnb, Zipcar)

These technologies have made many brick-and-mortar markets much more efficient, and in the process generated enormous wealth for the enabling companies (Google, Facebook, Amazon, etc.). However, the negative impact has been a net loss of jobs, and further, the newly unemployed are largely in the non-college-educated sector. The contrast is stark between a company like Instagram ($1B valuation and 13 employees) and the chronic unemployment situation in the non-college sector.   Can the Internet actually help...

EDA to private equity, part 2

January 8, 2013

A surprising number of people seem to agree with a comment in my recent posting on EE Times that EDA is run like a family business. I received a lot of feedback on this posting, including a number of recurring questions: Why would private equity be interesting? Is EDA interesting enough? If it's such a good idea, why is it not done? Is EDA software really sticky? Isn’t the real problem competitive discounting? What about the debt? Would this not be an issue much as it was for Freescale? What about Cadence (which negotiated to sell itself to private equity firms in 2007; the talks broke down)? For the answers (Click here)

Should private equity consolidate EDA?

January 8, 2013

In the last decade, the three major EDA companies (Synopsys Inc., Cadence Design Systems Inc., and Mentor Graphics Corp.) have had a combined market capitalization which has stayed largely flat (approx. $9.5 billion). In the same time period, the leading hamburger chain, McDonald's, grew shareholder value by 3X (from $28 billion to $99 billion), and the leading household goods manufacturer, Procter & Gamble, grew shareholder value by over 50 percent (from $121 billion to $185 billion). This situation is obviously not good for shareholders, but the relatively stagnant state of the industry is also negative for customers and employees.   Read the rest of the article (Click here)

Florida Start-up Revolutionizing Marketing for Small Service Providers; Adventure Tourism is Focus of First Product Release

March 13, 2012

As published on www.prweb.com on March 13, 2012 Ocoos, http://www.ocoos.com, an Internet spin-out from the University of Florida, announces the release of its website for consumer use. Ocoos uses Internet technology to radically increase the reach and efficiency of small service providers using a vertical market approach. The adventure tourism market in Florida is a vertical market where a large number of small service providers are attempting to broaden their reach in the vast wilderness of the Internet. With this first release, Ocoos has built a platform which allows these service providers to integrate and amplify their messages around topics such as Kayaking, Fishing, Family Vacations, etc. Consumer Benefits: Ocoos helps the consumer wade through the “search” noise by building a subject-matter-specific platform which integrates content from service providers, noted experts, and social networking sources. With this platform, the consumer can learn from as well as contribute...

Empirical results from the transformation of a large commercial technical computing environment

October 15, 2009

Technical computing has unique requirements which are exemplified by factors such as: an extreme focus on run-time performance, a high degree of responsiveness to the customer base, a continued focus on innovation, concurrent support on multiple computing platforms and, most importantly, a very limited set of deep subject matter experts who have the skills to build the solution. Exacerbating the above issues is the fact that the field has traditionally seen a great deal of merger and acquisition activity, which leads quickly to an accumulation of disjoint software development systems. This paper describes a detailed case study built within a leading technical computing company which achieved significant success by focusing relentlessly on enhancing the productivity of the individual developer. The work was driven by the author as the general manager of the organization, and measured results of the transformation are presented in this paper. Full Article in Proceedings of the 2009...

Method and apparatus for enhancing the performance of event driven dynamic simulation of digital circuits based on netlist partitioning techniques

May 2, 2006

Disclosed is a full-chip level verification methodology that combines static timing analysis techniques with dynamic event-driven simulation. The specification discloses a capability to partition a multiple-clock design into various clock domains and surrounding asynchronous regions automatically and to determine the timing of the design on an instance by instance basis. Static timing analysis techniques can be leveraged to verify the synchronous cores of each clock domain. The asynchronous regions of the design and the interaction between synchronous cores of the clock domains are validated using detailed dynamic event-driven simulation without the burden of carrying the interior timing attributes of the synchronous cores that have already been verified (Full Patent Here).

Method and apparatus for critical and false path verification

March 30, 2004

A method and apparatus for critical and false path verification takes all the potential false paths and captures the conditions that would make them true paths (or false paths) as a Boolean expression (netlist), for the combinational logic only. The netlist does not have to be at the gate level, but can be a simplified gate-level representation, because the verification process is only concerned with the logical behavior, not the actual structure. This allows the simulation to execute more quickly. Since the conditions are only captured between register elements, it can be formally proved whether or not the path can be exercised. If no register value can activate the path, then the analysis is done. Otherwise, a simulation is performed to determine whether the register values required to activate the condition actually occur. If the Boolean condition can be satisfied, the simulation is performed on the sequential logic to justify those values. If the satisfiability engine fails...
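The flow described above — capture the activating condition as a Boolean expression over register values, then ask whether any assignment satisfies it — can be illustrated with a toy brute-force satisfiability check. This is only a sketch: a real flow would extract the condition from the netlist and use a proper SAT engine, and all names here are illustrative.

```python
from itertools import product

def is_false_path(condition, num_regs):
    """Return True if no register assignment activates the path,
    i.e. the captured Boolean condition is unsatisfiable."""
    for regs in product([0, 1], repeat=num_regs):
        if condition(regs):
            return False  # satisfiable: the path may be a true path
    return True

# Hypothetical condition r0 AND NOT r0: no assignment satisfies it,
# so the path is provably false and analysis is done.
assert is_false_path(lambda r: r[0] and not r[0], num_regs=1)

# Condition r0 AND r1 is satisfiable, so simulation on the sequential
# logic would be needed to justify those register values.
assert not is_false_path(lambda r: r[0] and r[1], num_regs=2)
```

The brute-force loop is exponential in the register count, which is exactly why the patent's restriction of conditions to register boundaries (enabling formal proof) matters.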

Method and apparatus for developing multiprocessor cache control protocols using an external acknowledgement signal to set a cache to a dirty state

November 18, 2003

A computer system includes an external unit governing a cache which generates a set-dirty request as a function of a coherence state of a block in the cache to be modified. The external unit modifies the block of the cache only if an acknowledgment granting permission is received from a memory management system responsive to the set-dirty request. The memory management system receives the set-dirty request, determines the acknowledgment based on contents of the plurality of caches and the main memory according to a cache protocol and sends the acknowledgment to the external unit in response to the set-dirty request. The acknowledgment will either grant permission or deny permission to set the block to the dirty state (Full Patent Here).
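The handshake above — set-dirty request, protocol decision, grant-or-deny acknowledgment — can be sketched as a toy model. The coherence states and grant policy below are illustrative stand-ins, not the patented protocol:

```python
# Toy model of the set-dirty handshake (states and policy are illustrative).
def memory_manager_ack(block_state, other_caches_share):
    """Grant permission to set a block dirty only if the requester holds
    it exclusively; otherwise deny, per the (assumed) cache protocol."""
    if block_state == "exclusive" and not other_caches_share:
        return "grant"
    return "deny"

def external_unit_modify(block_state, other_caches_share):
    """The external unit modifies the block only on a granting ack."""
    ack = memory_manager_ack(block_state, other_caches_share)
    return "modified" if ack == "grant" else "unchanged"

assert external_unit_modify("exclusive", other_caches_share=False) == "modified"
assert external_unit_modify("shared", other_caches_share=True) == "unchanged"
```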

Method and apparatus for performing speculative memory fills into a microprocessor

December 10, 2002

According to the present invention a cache within a multiprocessor system is speculatively filled. To speculatively fill a designated cache, the present invention first determines an address which identifies information located in a main memory. The address may also identify one or more other versions of the information located in one or more caches. The process of filling the designated cache with the information is started by locating the information in the main memory and locating other versions of the information identified by the address in the caches. The validity of the information located in the main memory is determined after locating the other versions of the information. The process of filling the designated cache with the information located in the main memory is initiated before determining the validity of the information located in main memory. Thus, the memory reference is speculative (Full Patent Here).

Method and apparatus for delaying the execution of dependent loads

October 8, 2002

Load/store execution order violations in an out-of-order processor are reduced by determining whether the source address of a load instruction is the same as the destination address of a store instruction on whose execution the load instruction depends. If they are the same, then execution of the load instruction is delayed until execution of the store instruction. In a system where virtual registers are mapped to physical registers, the physical registers mapped by the store and load instructions are compared. A table has entries corresponding to instructions in an instruction queue. In each table entry corresponding to a store instruction, the store instruction's destination address offset and physical register reference are saved. A load instruction's source address offset and physical register reference are compared with each of the table entries corresponding to store instructions to determine whether a dependency exists. Furthermore, a matrix also has row entries corresponding to...
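The table comparison described above can be sketched as follows. This is a behavioral toy, not the hardware structure; the field names and the exact-match rule are illustrative assumptions:

```python
# Toy sketch of the store table: each entry records a pending store's
# address offset and physical register; a load is delayed while any
# entry matches its own offset and physical register.
class StoreTable:
    def __init__(self):
        self.entries = []  # list of (address_offset, physical_register)

    def record_store(self, offset, phys_reg):
        self.entries.append((offset, phys_reg))

    def load_must_wait(self, offset, phys_reg):
        """True if the load depends on a pending store."""
        return (offset, phys_reg) in self.entries

    def retire_store(self, offset, phys_reg):
        self.entries.remove((offset, phys_reg))

table = StoreTable()
table.record_store(offset=8, phys_reg=3)
assert table.load_must_wait(offset=8, phys_reg=3)       # dependent: delay
assert not table.load_must_wait(offset=16, phys_reg=3)  # independent: issue
table.retire_store(8, 3)
assert not table.load_must_wait(offset=8, phys_reg=3)   # store done: proceed
```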

Methods and apparatus for minimizing the impact of excessive instruction retrieval

September 3, 2002

A technique controls memory access requests. The technique involves acquiring a first series of requests including a prefetch request for performing a prefetch operation that prefetches a first set of instructions from a memory, and adding a first entry in a request queue in response to the prefetch request. The first entry identifies the prefetch operation. The technique further involves attempting to retrieve a second set of instructions from a cache, resulting in a cache miss, and generating, in response to the cache miss, a second series of requests including a fetch request for performing a fetch operation that fetches the second set of instructions from the memory to satisfy the cache miss. The technique further involves acquiring the second series of requests that includes the fetch request, and adding a second entry in the request queue in response to the fetch request. The second entry identifies the fetch operation. The technique further involves invalidating the first entry...

Method and apparatus for optimizing bcache tag performance by inferring bcache tag state from internal processor state

June 4, 2002

An architecture which splits primary and secondary cache memory buses and maintains cache hierarchy consistency without performing an explicit invalidation of the secondary cache tag. Two explicit rules are used to determine the status of a block read from the primary cache. In particular, if any memory reference subset matches a block in the primary cache, the associated secondary cache block is ignored. Secondly, if any memory reference subset matches a block in the miss address file, the associated secondary cache block is ignored. Therefore, any further references which subset match the first reference are not allowed to proceed until the fill back to main memory has been completed and the associated miss address file entry has been retired. This ensures that no agent in the host processor or an external agent can illegally use the stale secondary cache data (Full Patent Here).
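The two rules above can be encoded as a short predicate. The structure names and the block-aligned "subset match" below are illustrative assumptions, not the patented definitions:

```python
# Toy encoding of the two rules for ignoring the secondary (board) cache
# block on a memory reference.
def ignore_secondary_block(ref_addr, primary_cache, miss_address_file,
                           subset_match):
    """Rule 1: a subset match against any primary-cache block ignores the
    secondary block. Rule 2: a subset match against the miss address file
    does the same (until the fill completes and the entry retires)."""
    if any(subset_match(ref_addr, blk) for blk in primary_cache):
        return True
    if any(subset_match(ref_addr, ent) for ent in miss_address_file):
        return True
    return False

# Illustrative subset match: same 64-byte-aligned block address.
same_block = lambda a, b: a // 64 == b // 64

assert ignore_secondary_block(0x100, primary_cache=[0x120],
                              miss_address_file=[], subset_match=same_block)
assert ignore_secondary_block(0x200, primary_cache=[],
                              miss_address_file=[0x210], subset_match=same_block)
assert not ignore_secondary_block(0x300, primary_cache=[0x100],
                                  miss_address_file=[0x200], subset_match=same_block)
```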

Method and apparatus for developing multiprocessor cache control protocols by presenting a clean victim signal to an external system

May 28, 2002

A multiprocessor system includes a plurality of processors, each processor having one or more caches local to the processor, and a memory controller connectable to the plurality of processors and a main memory. The memory controller manages the caches and the main memory of the multiprocessor system. A processor of the multiprocessor system is configurable to evict from its cache a block of data. The selected block may have a clean coherence state or a dirty coherence state. The processor communicates a notify signal indicating eviction of the selected block to the memory controller. In addition to sending a write victim notify signal if the selected block has a dirty coherence state, the processor sends a clean victim notify signal if the selected block has a clean coherence state (Full Patent Here).

Method and apparatus for developing multiprocessor cache control protocols using a memory management system generating atomic probe commands and system data control response commands

February 19, 2002

A memory management system couples processors to each other and to a main memory. Each processor may have one or more associated caches local to that processor. A system port of the memory management system receives a request from a source processor of the processors to access a block of data from the main memory. A memory manager of the memory management system then converts the request into a probe command having a data movement part identifying a condition for movement of the block out of a cache of a target processor and a next coherence state part indicating a next state of the block in the cache of the target processor (Full Patent Here).

Method and apparatus for developing multiprocessor cache control protocols using atomic probe commands and system data control response commands

November 6, 2001

A computing apparatus connectable to a cache and a memory, includes a system port configured to receive an atomic probe command or a system data control response command having an address part identifying data stored in the cache which is associated with data stored in the memory and a next coherence state part indicating a next state of the data in the cache. The computing apparatus further includes an execution unit configured to execute the command to change the state of the data stored in the cache according to the next coherence state part of the command (Full Patent Here).

Method and apparatus for resolving probes in multi-processor systems which do not use external duplicate tags for probe filtering

September 25, 2001

A processor of a multiprocessor system is configured to transmit a full probe to a cache associated with the processor to transfer data from the stored data of the cache. The data corresponding to the full probe is transferred during a time period. A first tag-only probe is also transmitted to the cache during the same time period to determine if the data corresponding to the tag-only probe is part of the stored data stored in the cache. A stream of probes accesses the cache in two stages. The cache is composed of a tag structure and a data structure. In the first stage, a probe is designated a tag-only probe and accesses the tag structure, but not the data structure, to determine tag information indicating a hit or a miss. In the second stage, if the probe returns tag information indicating a cache hit the probe is designated to be a full probe and accesses the data structure of the cache. If the probe returns tag information indicating a cache miss the probe does not proceed to...
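The two-stage probe flow above — consult the tag structure first, touch the data structure only on a hit — can be sketched in a few lines. The dictionary-based tag and data structures are illustrative stand-ins for the hardware arrays:

```python
# Toy two-stage probe: stage 1 is a tag-only probe against the tag
# structure; stage 2 promotes the probe to a full probe and accesses
# the data structure only when stage 1 reports a hit.
def resolve_probe(addr, tags, data):
    # Stage 1: tag-only probe.
    if addr not in tags:
        return None                 # miss: the data structure is never touched
    # Stage 2: promoted to a full probe.
    return data[tags[addr]]

tags = {0x40: 0}                    # block address -> data-structure index
data = ["payload"]
assert resolve_probe(0x40, tags, data) == "payload"  # hit: full probe
assert resolve_probe(0x80, tags, data) is None       # miss: tag-only
```

The benefit being modeled is bandwidth: misses consume only tag-structure accesses, leaving the data structure free for probes that actually hit.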

Method and apparatus for a dedicated physically indexed copy of the data cache tag arrays

June 26, 2001

A data caching system and method includes a data store for caching data from a main memory, a primary tag array for holding tags associated with data cached in the data store, and a duplicate tag array which holds copies of the tags held in the primary tag array. The duplicate tag array is accessible by functions, such as external memory cache probes, such that the primary tag array remains available to the processor core. An address translator maps virtual page addresses to physical page addresses. In order to allow a data caching system which is larger than a page size, a portion of the virtual page address is used to index the tag arrays and data store. However, because of the virtual to physical mapping, the data may reside in any of a number of physical locations. During an internally-generated memory access, the virtual address is used to look up the cache. If there is a miss, other combinations of values are substituted for the virtual bits of the tag array index. For external...

Method and apparatus for minimizing dcache index match aliasing using hashing in synonym/ subset processing

June 26, 2001

A data caching system comprises a hashing function, a data store, a tag array, a page translator, a comparator and a duplicate tag array. The hashing function combines an index portion of a virtual address with a virtual page portion of the virtual address to form a cache index. The data store comprises a plurality of data blocks for holding data. The tag array comprises a plurality of tag entries corresponding to the data blocks, and both the data store and tag array are addressed with the cache index. The tag array provides a plurality of physical address tags corresponding to physical addresses of data resident within corresponding data blocks in the data store addressed by the cache index. The page translator translates a tag portion of the virtual address to a corresponding physical address tag. The comparator verifies a match between the physical address tag from the page translator and the plurality of physical address tags from the tag array, a match indicating that data...
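The hashing step described above — fold bits of the virtual page portion into the index portion to form the cache index — can be sketched numerically. The bit widths and the XOR fold below are illustrative assumptions, not the patented hash function:

```python
# Toy sketch of forming a cache index by combining the index portion of a
# virtual address with bits of its virtual page number (8-bit index,
# 64-byte blocks, 8 KB pages are all assumed parameters).
def cache_index(vaddr, index_bits=8, page_shift=13):
    index = (vaddr >> 6) & ((1 << index_bits) - 1)    # index portion
    vpage = vaddr >> page_shift                       # virtual page portion
    return index ^ (vpage & ((1 << index_bits) - 1))  # folded cache index

# Two addresses with the same raw index bits but different virtual pages
# now map to different sets, reducing index-match aliasing in
# synonym/subset processing.
a = cache_index(0x2040)
b = cache_index(0x4040)
assert 0 <= a < 256 and 0 <= b < 256
assert a != b
```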

Distributed data dependency stall mechanism

June 19, 2001

A method and apparatus for preventing system wide data dependent stalls is provided. Requests that reach the top of a probe queue and which target data that is not contained in an attached cache memory, are stalled until the data is filled into the appropriate location in cache memory. Only the associated central processor unit's probe queue is stalled and not the entire system. Accordingly, the present invention allows a system to chain together two or more concurrent operations for the same data block without adversely affecting system performance (Full Patent Here).
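The per-processor stall described above can be modeled with a small queue: only the head of one CPU's probe queue stalls, and only until the fill lands in its cache. The class and its fields are illustrative, not the hardware design:

```python
from collections import deque

# Toy model of one CPU's probe queue: the queue stalls only while the
# probe at its head targets a block not yet filled into the local cache;
# other CPUs' probe queues are unaffected.
class ProbeQueue:
    def __init__(self, cache):
        self.cache = cache           # set of block addresses present locally
        self.queue = deque()

    def push(self, block):
        self.queue.append(block)

    def service_head(self):
        """Service the head probe if its data is present; else stall."""
        if self.queue and self.queue[0] in self.cache:
            return self.queue.popleft()
        return None                  # stalled: the head's fill is incomplete

q = ProbeQueue(cache={0xA0})
q.push(0xB0)                         # targets data not yet filled
q.push(0xA0)
assert q.service_head() is None      # head (0xB0) stalls this queue only
q.cache.add(0xB0)                    # fill completes
assert q.service_head() == 0xB0      # head proceeds
assert q.service_head() == 0xA0      # queue drains in order
```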

Method and apparatus for minimizing pincount needed by external memory control chip for multiprocessors with limited memory size requirements

March 6, 2001

A computing apparatus has a mode selector configured to select one of a long-bus mode corresponding to a first memory size and a short-bus mode corresponding to a second memory size which is less than the first memory size. An address bus of the computing apparatus is configured to transmit an address consisting of address bits defining the first memory size and a subset of the address bits defining the second memory size. The address bus has N communication lines each configured to transmit one of a first number of bits of the address bits defining the first memory size in the long-bus mode and M of the N communication lines each configured to transmit one of a second number of bits of the address bits defining the second memory size in the short-bus mode, where M is less than N (Full Patent Here).
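The dual-mode bus above — N lines for the larger memory size, M < N lines for the smaller one — can be sketched with assumed widths (40 and 32 bits here are illustrative, not the patented values):

```python
# Toy sketch of the dual-mode address bus: long-bus mode drives all N
# communication lines; short-bus mode drives only M of them, saving pins
# when the system's memory size permits.
def lines_driven(address, mode, n_lines=40, m_lines=32):
    width = n_lines if mode == "long" else m_lines
    assert address < (1 << width), "address exceeds the selected memory size"
    return width

assert lines_driven(0xFFFF_FFFF, "short") == 32    # fits the second memory size
assert lines_driven(0x1_0000_0000, "long") == 40   # requires long-bus mode
```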

Method and apparatus for optimizing the performance of LDxL and STxC interlock instructions in the context of a write invalidate protocol

October 31, 2000

A technique for implementing load-locked and store-conditional instruction primitives by using a local cache for information about exclusive ownership. The valid bit in particular provides information to properly execute load-locked and store-conditional instructions without the need for a lock flag or local lock address registers for each individual locked address. Integrity of locked data is accomplished by ensuring that load-locked and store-conditional instructions are processed in order, that no internal agents can evict blocks from a local cache as a side effect of their processing, that external agents update the contents of cache memories first using invalidating probe commands, and that only non-speculative instructions are permitted to generate external commands (Full Patent Here).

Distributed data dependency stall mechanism

July 4, 2000

A method and apparatus for preventing system wide data dependent stalls is provided. Requests that reach the top of a probe queue and which target data that is not contained in an attached cache memory subsystem, are stalled until the data is filled into the appropriate location in cache memory. Only the associated central processor unit's probe queue is stalled and not the entire system. Accordingly, the present invention allows a system to chain together two or more concurrent operations for the same data block without adversely affecting system performance (Full Patent Here).

Determining hardware complexity of software operations

March 7, 2000

A new class of general purpose computers called Programmable Reduced Instruction Set Computers (PRISC) uses RISC techniques as a basis for operation. In addition to the conventional RISC instructions, PRISC computers provide hardware programmable resources which can be configured optimally for a given user application. A given user application is compiled using a PRISC compiler which recognizes and evaluates complex instructions into a Boolean expression which is assigned an identifier and stored in conventional memory. The recognition of instructions which may be programmed in hardware is achieved through a combination of bit width analysis and instruction optimization. During execution of the user application on the PRISC computer, the stored expressions are loaded as needed into a programmable functional unit. Once loaded, the expressions are executed during a single instruction cycle (Full Patent Here).
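The compile/load/execute flow above can be modeled in software. Everything here is an illustrative stand-in for hardware: the "memory" is a dictionary, the Boolean expressions are lambdas, and the PFU is a class that evaluates whatever expression is currently loaded:

```python
# Toy model of the PRISC flow: a complex instruction is compiled into a
# Boolean expression, assigned an identifier, and stored in conventional
# memory; at run time it is loaded into a programmable functional unit
# (PFU) and evaluated in a single step.
compiled_exprs = {}   # identifier -> Boolean expression over inputs

def compile_instruction(ident, expr):
    compiled_exprs[ident] = expr     # "stored in conventional memory"

class PFU:
    def __init__(self):
        self.loaded = None

    def load(self, ident):
        self.loaded = compiled_exprs[ident]  # configure the PFU on demand

    def execute(self, *inputs):
        return self.loaded(*inputs)          # single-"cycle" evaluation

compile_instruction("parity3", lambda a, b, c: a ^ b ^ c)
pfu = PFU()
pfu.load("parity3")
assert pfu.execute(1, 0, 1) == 0
assert pfu.execute(1, 1, 1) == 1
```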

Method and apparatus for maximizing utilization of an internal processor bus in the context of external transactions running at speeds fractionally greater than internal transaction times

July 13, 1999

Use of an internal processor data bus is maximized in a system where external transactions may occur at a rate which is fractionally slower than the rate of the internal transactions. The technique inserts a selectable delay element in the signal path during an external operation such as a cache fill operation. The one-cycle delay provides a time slot in which an internal operation, such as a load from an internal cache, may be performed. This technique therefore permits full use of the time slots on the internal data bus. It can, for example, allow load operations to begin at a much earlier time than would otherwise be possible in architectures where fill operations can consume multiple bus time slots (Full Patent Here).

Hardware extraction technique for programmable reduced instruction set computers

October 6, 1998

A new class of general purpose computers called Programmable Reduced Instruction Set Computers (PRISC) use RISC techniques as a basis for operation. In addition to the conventional RISC instructions, PRISC computers provide hardware programmable resources which can be configured optimally for a given user application. A given user application is compiled using a PRISC compiler which recognizes and evaluates complex instructions into a Boolean expression which is assigned an identifier and stored in conventional memory. The recognition of instructions which may be programmed in hardware is achieved through a combination of bit width analysis and instruction optimization. During execution of the user application on the PRISC computer, the stored expressions are loaded as needed into a programmable functional unit. Once loaded, the expressions are executed during a single instruction cycle (Full Patent Here).

Dynamically programmable reduced instruction set computer with programmable processor loading on program number field and program number register contents

December 9, 1997

A new class of general purpose computers called Programmable Reduced Instruction Set Computers (PRISC) use RISC techniques as a basis for operation. In addition to the conventional RISC instructions, PRISC computers provide hardware programmable resources which can be configured optimally for a given user application. A given user application is compiled using a PRISC compiler which recognizes and evaluates complex instructions into a Boolean expression which is assigned an identifier and stored in conventional memory. The recognition of instructions which may be programmed in hardware is achieved through a combination of bit width analysis and instruction optimization. During execution of the user application on the PRISC computer, the stored expressions are loaded as needed into a programmable functional unit. Once loaded, the expressions are executed during a single instruction cycle (Full Patent Here).

Using pre-analysis and a 2-state optimistic model to reduce computation in transistor circuit simulation

December 2, 1997

Computational requirements are reduced for executing simulation code for a logic circuit design having at least some elements which are synchronously clocked by multiple phase clock signals, the logic design being subject to resistive conflicts and to charge sharing, the simulation code including data structures associated with circuit modules and nodes interconnecting the circuit modules. A three-state version of simulation code is generated for the circuit design, the three states corresponding to states 0, 1, or X, where X represents an undefined state. A preanalysis is performed of the three-state version, and phase waveforms are stored, each representing values occurring at a node of the code. For each phase of a module for which no event-based evaluation need be performed, an appropriate response to an event occurring with respect to the module of the three-state version is determined and stored. A two-state version of simulation code for the circuit design, the two states...

Formal implementation verification of the bus interface unit for the Alpha 21264 microprocessor

October 12, 1997

In this paper we present our method of formal verification of the transistor implementation of the Bus Interface Unit (BIU) of the Alpha 21264 microprocessor. We compare the logical description compiled from the Register Transfer Level (RTL) against that extracted from the custom-designed transistor-level schematics. BOVE, our BDD-based verification tool, does not require latch-to-latch correspondence, thus allowing the RTL to be more stable during the design process and giving the schematic designers freedom to implement race and timing optimizations. A unique "retiming" comparison algorithm efficiently compares partitions that include multiple pipeline stages, retiming optimizations and precharge logic. BOVE also verifies small finite-state machines that have different state encodings in the RTL and schematic. Full Article in Proceedings, 1997 IEEE International Conference on Computer Design: VLSI in Computers and Processors (ICCD '97).

The Alpha 21264: a 500 MHz out-of-order execution microprocessor

February 23, 1997

The paper describes the internal organization of the 21264, a 500 MHz, out-of-order, quad-fetch, six-way-issue microprocessor. The aggressive cycle time of the 21264, in combination with many architectural innovations such as out-of-order and speculative execution, enables this microprocessor to deliver an estimated 30 SpecInt95 and 50 SpecFp95 performance. In addition, the 21264 can sustain 5+ Gigabytes/sec of bandwidth to an L2 cache and 3+ Gigabytes/sec to memory for high performance on memory-intensive applications. Full Article in Compcon '97 Proceedings, IEEE.

Simulation of circuits

August 27, 1996

Computational requirements are reduced for executing simulation code for a logic circuit design having at least some elements which are synchronously clocked by multiple phase clock signals, the simulation code including data structures associated with circuit modules and nodes interconnecting the circuit modules. The simulation code is preanalyzed, and phase waveforms are stored, each representing values occurring at a node in successive phases. Based on the preanalysis, modules are categorized into a first category, for which an event-based evaluation is to be performed in each phase of the simulation, and a second category, for which no event-based evaluation need be performed in at least one but not all phases. For each phase of a second-category module, an appropriate response to an event occurring with respect to the module is determined. A data structure is then included in the simulation code, having an entry for each module of the code for controlling the phases in...

High capacity netlist comparison

October 31, 1995

A method for determining whether multiple representations of a design of a circuit are consistent with each other, where the circuit includes multiple devices with channels for conducting electrical current. Each representation includes a list of device elements that describe the devices and node elements that describe the nodes which interconnect the devices. The method includes modifying each of the lists by: (1) analyzing the device elements and the node elements to identify at least one channel connected region of said circuit (where a channel connected region includes the subset of the devices that have channels interconnected by a subset of the nodes), (2) defining, for each channel connected region, a channel connected region element that describes the subset of the devices and the subset of the nodes in the region, and (3) replacing the device elements of each subset of devices and the node elements of each subset of nodes in the lists with the channel connected...
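
Step (1) above, identifying channel connected regions, can be sketched with union-find over the source/drain nodes of each device: two devices fall in the same region exactly when their channels share interconnecting nodes, while gate terminals do not join regions since a gate controls a channel rather than conducting through it. The device-tuple format below is an assumption for illustration, not the patent's data structures.

```python
# Sketch of channel-connected-region (CCR) extraction: union-find over
# source/drain nodes groups devices whose channels are interconnected.

def channel_connected_regions(devices):
    """devices: list of (name, gate, source, drain) tuples.
    Returns a list of sets of device names, one set per CCR."""
    parent = {}

    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x

    def union(a, b):
        parent[find(a)] = find(b)

    for _name, _gate, src, drn in devices:
        union(src, drn)   # source and drain lie on the same channel path

    regions = {}
    for name, _gate, src, _drn in devices:
        regions.setdefault(find(src), set()).add(name)
    return list(regions.values())

# An inverter (two devices sharing the output net) plus an isolated pass gate:
devs = [("p1", "in", "vdd", "out"), ("n1", "in", "out", "gnd"),
        ("pass1", "en", "x", "y")]
ccrs = channel_connected_regions(devs)
print(sorted(sorted(r) for r in ccrs))  # [['n1', 'p1'], ['pass1']]
```

Replacing each such region with a single element, as in steps (2) and (3), is what shrinks the lists the comparison algorithm must match.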

A high-performance microarchitecture with hardware-programmable functional units

November 30, 1994

This paper explores a novel way to incorporate hardware-programmable resources into a processor microarchitecture to improve the performance of general-purpose applications. Through a coupling of compile-time analysis routines and hardware synthesis tools, we automatically configure a given set of hardware-programmable functional units (PFUs) and thus augment the base instruction set architecture so that it better meets the instruction set needs of each application. We refer to this new class of general-purpose computers as PRogrammable Instruction Set Computers (PRISC). Although similar in concept, the PRISC approach differs from dynamically programmable microcode because in PRISC we define entirely-new primitive data path operations. In this paper, we concentrate on the microarchitectural design of the simplest form of PRISC—a RISC microprocessor with a single PFU that only evaluates combinational functions. We briefly discuss the operating system and the programming...

PRISC software acceleration techniques

October 10, 1994

Programmable reduced instruction set computers (PRISC) are a new class of computers which can offer a programmable functional unit (PFU) in the context of a RISC datapath. PRISC create application-specific instructions to accelerate the performance of a particular application. Our previous work has demonstrated that peephole optimizations in a compiler can utilize PFU resources to accelerate the performance of general purpose programs. However, these compiler optimizations are limited by the structure of the input source code. This work generalizes our previous work and demonstrates that the performance of general abstract data types such as short-set vectors, hash tables, and finite state machines is significantly accelerated (250%-500%) by using PFU resources. Thus, a wide variety of end-user applications can be specifically designed to use PFU resources to accelerate performance. Results from applications in the domain of computer-aided design (CAD) are presented to...

PRISC: Programmable reduced instruction set computers

January 1, 1994

This thesis introduces Programmable Reduced Instruction Set Computers (PRISC) as a new class of general-purpose computers. PRISC use RISC techniques as a base, but in addition to the conventional RISC instruction resources, PRISC offer hardware programmable resources which can be configured based on the needs of a particular application. This thesis presents the architecture, operating system, and programming language compilation techniques which are needed to successfully build PRISC. Performance results are provided for the simplest form of PRISC -- a RISC microprocessor with a set of programmable functional units consisting of only combinational functions. Results for the SPECint92 benchmark suite indicate that an augmented compiler can provide a performance improvement of 22% over the underlying RISC computer with a hardware area investment less than that needed for a 2 kilobyte SRAM. In addition, active manipulation of the source code leads to significantly higher local...

Clock suppression techniques for synchronous circuits

October 1, 1993

A clock suppression based technique that takes advantage of the higher abstraction level provided by synchronous design techniques to improve logic simulation performance was given by the authors (see Proc. IEEE Int. Conf. on Comput. Aided Des. Integr. Circuit Syst., pp.62-65, 1990). Here, the authors elaborate on those techniques and present extensions that can offer an average performance increase of over 5x and a peak performance increase of over 10x that of a conventional logic simulator. The viability of the approach is shown by presenting results from switch-level simulations of large industrial examples. It is shown that because clock suppression based techniques are CPU-bound, they can take advantage of the recent explosive growth of CPU performance. Full Article in IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems.

HCNC: High capacity netlist compare

September 5, 1993

The author describes HCNC (High Capacity Netlist Compare), a technique for netlist comparison that uses the natural hierarchy of the transistor circuit to significantly increase the capacity of traditional netlist comparison algorithms. Since the natural hierarchy of the circuit is used, HCNC does not require hierarchical information from the user. HCNC also has some desirable properties for error recovery. Results from the network comparison of several large industrial circuits show the viability of this algorithm. Full Article in Proceedings of the IEEE 1993 Custom Integrated Circuits Conference.

Automatic detection of MOS synchronizers for timing verification

November 11, 1991

Static timing verifiers need to know at which points data are synchronized with clocks in a circuit. Typically, this happens at latches and in clock qualification gates. However, in a general, full-custom VLSI methodology, the 'latch-equivalents' are far more varied and difficult to detect reliably. The authors define these synchronization points, and present provably robust algorithms to locate them in a very general class of MOS networks, including arbitrary pass gates. The algorithms have been applied to a variety of full-custom CPUs of up to 500 K devices, and have been found to work extremely reliably and quite fast. Full Article in IEEE International Conference on Computer-Aided Design (ICCAD-91), Digest of Technical Papers, 1991.

Concurrent min-max simulation

February 25, 1991

Parametric process variations, which are inherent in the manufacture of complex digital circuits, can cause variations in the timing characteristics of a digital device. These device timing variations can cause catastrophic failures of the intended logical operation of the whole design. Min-Max Timing Simulation is a simulation technique which is well suited to verify that a given design functions correctly, even under the influence of parametric process variations. Unfortunately, in the past, Min-Max Timing Simulation has been very expensive in simulation CPU time and in the amount of memory consumed. We present a technique, Concurrent Min-Max Simulation (CMMS), which employs the techniques developed in Concurrent Fault Simulation to elegantly solve the Min-Max Timing simulation problem. Full Article in Proceedings of the Conference on European Design Automation.
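
The core of min-max timing can be illustrated with interval arithmetic: each gate carries a [min, max] delay, and signal arrival times are propagated as (earliest, latest) pairs. This is a simplified sketch of the underlying bound computation only; it ignores logic values and does not reproduce the concurrent-fault-simulation machinery that makes CMMS efficient. All names are illustrative.

```python
# Toy min-max arrival-time propagation: conservative earliest/latest bounds
# through a combinational network, given per-gate delay intervals.

def arrival(gate_delays, fanin, primary_inputs, node):
    """Return the (earliest, latest) arrival interval at `node`.

    gate_delays: node -> (dmin, dmax) delay interval for the driving gate;
    fanin: node -> list of driver nodes;
    primary_inputs: node -> (tmin, tmax) arrival interval at the inputs.
    """
    if node in primary_inputs:
        return primary_inputs[node]
    drivers = [arrival(gate_delays, fanin, primary_inputs, d)
               for d in fanin[node]]
    dmin, dmax = gate_delays[node]
    # Earliest possible output: earliest driver plus minimum gate delay;
    # latest possible output: latest driver plus maximum gate delay.
    return (min(t[0] for t in drivers) + dmin,
            max(t[1] for t in drivers) + dmax)

# Two inputs feeding a gate "g" with a 2..5 ns delay range:
pis = {"a": (0, 0), "b": (1, 3)}
print(arrival({"g": (2, 5)}, {"g": ["a", "b"]}, pis, "g"))  # (2, 8)
```

A design passes min-max verification when every such interval still meets its setup and hold constraints, i.e., the circuit works for any delay assignment inside the bounds.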

Exploitation of periodicity in logic simulation of synchronous circuits

November 11, 1990

An overwhelming majority of logic designers use synchronous logic design techniques to manage the complexity of their designs and rely on logic simulation techniques for design verification. Yet, logic simulators do not take advantage of the higher abstraction level provided by synchronous logic design techniques to improve their performance. A general technique is presented which takes advantage of the high degree of periodicity common in synchronous logic designs. It is shown that a performance improvement of at least 200% occurs when these techniques are applied within the COSMOS simulation system to simulate large digital systems. Full Article in IEEE International Conference on Computer-Aided Design (ICCAD-90), Digest of Technical Papers, 1990.
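
The periodicity idea can be sketched as caching: if a module's response in each clock phase is known to repeat, its evaluation can be replaced by a lookup into a stored per-phase waveform instead of a full event-driven re-evaluation. This is a minimal illustration of the principle, not the COSMOS implementation; all names below are hypothetical.

```python
# Toy phase-waveform caching: evaluate a periodic module once per phase,
# then replay the stored value on every later occurrence of that phase.

class PhaseCachedModule:
    def __init__(self, evaluate, n_phases):
        self.evaluate = evaluate      # the expensive event-based evaluation
        self.n_phases = n_phases
        self.waveform = {}            # phase -> cached output value
        self.evals = 0                # count of real evaluations performed

    def output(self, phase):
        phase %= self.n_phases
        if phase not in self.waveform:          # evaluate once per phase...
            self.waveform[phase] = self.evaluate(phase)
            self.evals += 1
        return self.waveform[phase]             # ...then replay the stored value

# A 2-phase module simulated over 8 phase slots needs only 2 real evaluations.
mod = PhaseCachedModule(evaluate=lambda ph: ph ^ 1, n_phases=2)
trace = [mod.output(t) for t in range(8)]
print(trace, mod.evals)  # [1, 0, 1, 0, 1, 0, 1, 0] 2
```

The speedup grows with the number of simulated cycles, since the evaluation cost is paid once per phase rather than once per event.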

A global feedback detection algorithm for VLSI circuits

September 17, 1990

A global feedback detection algorithm for VLSI circuits is presented. It can identify all the global feedback loops within reasonable computational time. The overall algorithm is as follows: First, all the strongly connected components (SCC) are found using a modified version of the Tarjan algorithm which can handle circuits with flip-flops and latches. Second, each SCC recursively cuts the loops based on heuristic criteria to reduce computation time and space until all loops inside this SCC are cut. The modified Tarjan algorithm for finding SCCs in circuits consisting of functional primitive elements such as flip-flops and latches is described. A recursive loop-cutting algorithm for strongly connected components is presented, and a top-level partitioning scheme to reduce memory requirements and computation time for finding global feedback loops is proposed. Full Article in ICCD'90 Proceedings, 1990 IEEE International Conference on Computer Design: VLSI in Computers and Processors,...
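
The first phase above, finding strongly connected components, can be sketched with the textbook form of Tarjan's algorithm on a plain directed graph (node to list of fanout nodes). The paper's extensions for flip-flops and latches, and the recursive loop-cutting phase, are not reproduced here.

```python
# Textbook Tarjan SCC detection: any SCC with more than one node (or a
# self-loop) contains feedback. The circuit is modeled as a directed graph.

def tarjan_sccs(graph):
    """Return a list of SCCs (each a list of nodes) for a directed graph."""
    index_of, lowlink, on_stack = {}, {}, set()
    stack, sccs = [], []
    counter = [0]

    def strongconnect(v):
        index_of[v] = lowlink[v] = counter[0]
        counter[0] += 1
        stack.append(v)
        on_stack.add(v)
        for w in graph.get(v, []):
            if w not in index_of:
                strongconnect(w)
                lowlink[v] = min(lowlink[v], lowlink[w])
            elif w in on_stack:
                lowlink[v] = min(lowlink[v], index_of[w])
        if lowlink[v] == index_of[v]:        # v is the root of an SCC
            scc = []
            while True:
                w = stack.pop()
                on_stack.discard(w)
                scc.append(w)
                if w == v:
                    break
            sccs.append(scc)

    for v in graph:
        if v not in index_of:
            strongconnect(v)
    return sccs

# A small netlist with one feedback loop (a -> b -> c -> a) and a fanout d:
g = {"a": ["b"], "b": ["c"], "c": ["a", "d"], "d": []}
loops = [scc for scc in tarjan_sccs(g) if len(scc) > 1]
print(loops)  # one SCC containing a, b, c
```

Each multi-node SCC found this way is then handed to the loop-cutting phase, which chooses cut points heuristically until no loop remains inside the component.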

An interactive sequential test pattern generation system

August 29, 1989

The authors present ITPG (Interactive Test Pattern Generator), an automatic test pattern generation tool which produces high fault coverage for complex sequential circuits. The tool is more successful than previous attempts at sequential test generation because of the innovative heuristics and high-level sequential primitives used in the system. Old heuristics, such as controllability and observability, have been extended to the sequential world, and a new heuristic, grouping, has been added to accelerate sequential test pattern generation. In addition, the tool allows the designer to influence the test generation process, thus resulting in the 'interactive' nature of the tool. Results from real industrial VLSI circuits show the effectiveness of this tool. Full Article in International Test Conference Proceedings, Meeting the Tests of Time, 1989.

A statistical design rule developer

October 1, 1986

In this paper, a general methodology for design rule development and the CAD tool which implements this methodology, Statistical Design Rule Developer (STRUDEL), are presented. The focus of the proposed approach is the concept of a statistical design rule, which is defined as a geometric design rule with an associated probability of failure. Global lateral variations obtained from FABRICS, and local spot defects obtained from measurements, are taken into account when calculating the probability of failure. A failure model which accounts for catastrophic faults has been enhanced to include some parametric faults. STRUDEL can be used as a guide to generate layout design rules and may be extended to a wide range of applications including coarse yield estimation during design rule checking. Full Article in IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems.