On-Chip Network Architecture for Billion Transistor Era


Hi

I'm studying network-on-chip architectures suitable for consumer
applications in 2005-2010 and I'm looking for opinions on this topic.

Over the past months, I have read tens of publications about possible network
topologies and services for ICs with hundreds of millions of gates. Most of the
principles and mechanisms come from existing off-chip networks for
multiprocessor systems and are mainly based on best-effort services. Only
a few publications introduce ideas for offering both best-effort and
guaranteed services on the same IC, in order to properly support
strict real-time data streaming (for multimedia or communications, for
instance).
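To make the distinction concrete, here is a minimal sketch of how a single router output link could mix the two service classes: guaranteed-throughput (GT) connections own fixed slots in a TDMA table, and best-effort (BE) traffic fills whatever slots are left idle. This is only an illustration of the idea, not any published design; all class and variable names are my own.

```python
from collections import deque

class LinkArbiter:
    """Toy arbiter for one output link: guaranteed-throughput (GT)
    connections own fixed TDMA slots; best-effort (BE) traffic uses
    the slots left over. Names and structure are illustrative only."""

    def __init__(self, slot_table):
        # slot_table[i] = connection id owning slot i, or None (free slot)
        self.slot_table = slot_table
        self.gt_queues = {}          # connection id -> deque of flits
        self.be_queue = deque()      # single shared best-effort queue
        self.cycle = 0

    def enqueue_gt(self, conn, flit):
        self.gt_queues.setdefault(conn, deque()).append(flit)

    def enqueue_be(self, flit):
        self.be_queue.append(flit)

    def step(self):
        """Advance one cycle; return the flit sent on the link (or None)."""
        owner = self.slot_table[self.cycle % len(self.slot_table)]
        self.cycle += 1
        q = self.gt_queues.get(owner)
        if q:                        # reserved slot with data waiting: GT wins
            return q.popleft()
        if self.be_queue:            # free or idle slot: BE may use it
            return self.be_queue.popleft()
        return None
```

With a 4-slot table where connection "A" owns slot 0, "A" is guaranteed a quarter of the link bandwidth no matter how much best-effort traffic is queued, while BE traffic still gets every cycle the GT connection does not use.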

Taking into account the specific characteristics of many current
systems-on-chip for consumer applications (small silicon area, low power
consumption, strict high-bandwidth and/or low-latency requirements), it
appears that offering various kinds of guaranteed services over a distributed
network at low cost is very difficult, if possible at all.

Some of the tricks developed for multiprocessor systems might be of
particular interest for systems-on-chip: out-of-order processors with
two-level cache hierarchies to better hide network latency, more complex
network routers to better track connections between endpoints and to better
resolve network contention, and so on. But looking carefully, one gets the
feeling that these tricks hide the fact that the current network-on-chip
approach is not suitable given the specific IC constraints. We are wasting
resources that bring no significant added value to the consumer: in the
consumer market (ICs selling for a few dollars to a few tens of dollars),
the consumer looks for maximum added value at minimum cost, i.e. minimum
complexity.

I think the main mistake is keeping the same IC design approach for too
long. Traditionally, IC architects and designers develop ICs with 100%
internal data integrity, and therefore spend more and more time developing
and validating the functional blocks that support it. Considering the
unpredictable nature of distributed networks, can we continue this way?

I don't think so, and I'm wondering whether there have been attempts to
design ICs for consumer multimedia/communication applications using
unreliable network services that offer only statistical guarantees to the
network endpoints, over a network-on-chip of tens to hundreds of millions of
gates. Indeed, statistical guarantees might be of interest, since the main
criterion is the consumer's perception of the IC's features and performance
(for video applications, for instance, the consumer does not pay attention
to every pixel of every frame displayed on a TV screen).
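As a rough illustration of what "statistical guarantee" means here, a quick Monte-Carlo estimate shows how a small independent per-pixel loss probability translates into the fraction of pixels that actually reach the display. The numbers and the independence assumption are purely illustrative, not a real NoC traffic model.

```python
import random

def delivered_fraction(pixels_per_frame, frames, drop_prob, seed=0):
    """Monte-Carlo estimate of the fraction of pixels that survive a
    lossy on-chip link with an independent per-pixel drop probability.
    Illustrative sketch only, not a model of any real interconnect."""
    rng = random.Random(seed)
    delivered = 0
    total = pixels_per_frame * frames
    for _ in range(total):
        if rng.random() >= drop_prob:   # pixel survives the lossy link
            delivered += 1
    return delivered / total
```

With a drop probability of 0.1%, roughly 99.9% of pixels come through, which a viewer is unlikely to notice; the network would then only need to commit to keeping the loss rate below some perceptual threshold, rather than to 100% delivery.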

In my opinion, statistical network services are key elements for future IC
products, as their intrinsic characteristics might be better suited to
upcoming applications that require tighter integration into the human
environment (such as the pervasive computing field), whose characteristics
are extreme unreliability and dynamics.

Don't hesitate to send comments or suggestions.

Eric
