EECS Facilities Statement

The Department of Electrical Engineering and Computer Science (EECS) relocated to the new Min H. Kao Electrical Engineering and Computer Science building in December 2011. The $37.5M facility houses 165,000 square feet of faculty, staff, and student offices, conference rooms, academic and research laboratories, and classrooms. Eleven laboratories are dedicated to engineering courses such as Signals and Systems, Networking, Circuits, Electronics, and Embedded Systems. Three laboratories are used for Senior Design projects. Two computer laboratories are available for instructional purposes:

  • Hydra – 31 Dell OptiPlex 7040 computers with 3.4 GHz Intel Core i7-6700 processors, 16 GB SDRAM, and nVidia GeForce GT 745 GPUs. The computers run Red Hat Enterprise Linux 7. Installed applications include CodeLite, CUDA, Eclipse, GIMP, NetBeans, MATLAB, Synopsys Saber, and a variety of open-source software applications and development environments.
  • Tesla – 31 Dell OptiPlex 9020 computers with 3.4 GHz Intel Core i5-4670 processors, 8 GB 1600 MHz DDR3 SDRAM, and nVidia GeForce GTX 645 GPUs. The computers run Red Hat Enterprise Linux 7. Installed applications include CodeLite, CUDA, Eclipse, GIMP, NetBeans, MATLAB, Synopsys Saber, and a variety of open-source software applications and development environments.

Various Windows laboratories run the Microsoft Windows 10 operating system. Installed applications include Microsoft Office, Visual Studio, Agilent, Ansoft, AutoCAD, Cadence, CodeLite, Comsol, CST Studio, Digilent, LabVIEW, LTSpice, Maple, MATLAB and Xilinx ISE.

Instructional laboratories and conference rooms have video projection capabilities and/or LED displays.

Min H. Kao network technologies include switched 100/1000 Mbps Ethernet and two wireless networks:

  • ut-open – an unsecured network available to faculty, staff, students, and visitors. A new portal will be put in place to manage network registration and visitor access.
  • eduroam – a secure network available to faculty, staff, and students. In addition to wireless access at UT, faculty, staff, and students are able to obtain Internet connectivity when visiting other participating eduroam institutions.

The EECS departmental infrastructure provides LDAP authentication, web, file/print, and database services.
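
As a hedged illustration of how a client application might authenticate against such an LDAP directory, the sketch below uses the Python ldap3 library; the server name and directory layout are hypothetical placeholders rather than actual EECS values.

    # Minimal sketch: authenticate a user via an LDAP simple bind.
    # The host name and DN layout are hypothetical examples only.
    from ldap3 import Server, Connection

    def authenticate(username: str, password: str) -> bool:
        server = Server("ldap.example.edu", use_ssl=True)
        user_dn = f"uid={username},ou=people,dc=example,dc=edu"
        try:
            # Entering the context opens the connection and binds;
            # a successful bind means the credentials are valid.
            with Connection(server, user=user_dn, password=password) as conn:
                return conn.bound
        except Exception:
            return False

A successful simple bind is the standard LDAP way to verify a password without ever reading it back from the directory.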

Local research facilities include:

  • ADA – Dell PowerEdge servers running a broad selection of licensed engineering applications including products from Agilent, Altium, Ansys, Cadence, Comsol, EMA, GE, Mentor Graphics, Manitoba, Plexim, Siemens, Sonnet, and Synopsys.
  • Cuda – a 4-node cluster of Dell Precision T7500n workstations, each with two 2.26 GHz quad-core Xeon “Gainestown” Nehalem E5520 processors (32 cores total). Each node is equipped with one nVidia Tesla C1060 graphics processing unit; a minimal GPU offload sketch follows this list.
  • Analytics – an IBM PureApplication W1500 (x86) and IBM Flex System V7000 Storage system. The PureApplication is a pre-configured platform-as-a-service (PaaS) system for transaction-oriented web and database applications, and is part of a multidisciplinary proof-of-concept project involving EECS, the Haslam College of Business, and IBM.
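
As a minimal, hedged sketch of the kind of GPGPU offload the Tesla-equipped Cuda nodes support, the example below adds two vectors on a GPU using the Numba library's CUDA interface; Numba is an illustrative stand-in here, not a statement about the cluster's installed toolchain.

    # Element-wise vector addition on the GPU via Numba's CUDA support.
    # Illustrative only; the lab software lists name the CUDA SDK itself.
    import numpy as np
    from numba import cuda

    @cuda.jit
    def vector_add(a, b, out):
        i = cuda.grid(1)  # global thread index across all blocks
        if i < out.size:
            out[i] = a[i] + b[i]

    a = np.arange(1_000_000, dtype=np.float32)
    b = np.ones_like(a)
    out = np.zeros_like(a)
    threads_per_block = 256
    blocks = (a.size + threads_per_block - 1) // threads_per_block
    # Numba copies the NumPy arrays to and from the device automatically.
    vector_add[blocks, threads_per_block](a, b, out)

Each GPU thread handles one array element, which is the usual pattern for embarrassingly parallel workloads on accelerators like the Tesla C1060.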

Remote research facilities available to EECS members include:

  • The University of Tennessee Newton HPC Program operates a number of computing systems with different capabilities and characteristics. All Newton compute clusters are accessible through a single login node, and all clusters use the same operating system, software environment, and job queue system; a job-submission sketch follows this list.
    • Sigma Cluster – the largest and most powerful Newton computational resource: a 108-node Lenovo NeXtScale cluster with Intel Haswell CPUs and an FDR InfiniBand interconnect, rated at a peak performance of 112 TFLOPS.
    • Monster – a shared memory SMP system with 1 TB of RAM and support for 48 CPU threads with Intel Broadwell CPUs. This system is designed to facilitate jobs that require a very large in-RAM data set in a single shared memory space.
    • Rho Cluster – a GPGPU cluster with 48 compute nodes, each hosting an nVidia Tesla GPGPU accelerator card. It has an accelerated peak performance of 80 TFLOPS.
    • Chi Cluster – a 1,728 CPU-core AMD compute cluster based on QDR InfiniBand.
    • Phi Cluster – an 864 CPU-core Intel compute cluster based on QDR InfiniBand.
  • The National Institute for Computational Sciences (NICS) at the University of Tennessee, Knoxville is one of the leading high-performance computing centers of excellence in the United States. NICS is co-located on the University of Tennessee, Knoxville campus and, in Oak Ridge, on the Oak Ridge National Laboratory (ORNL) campus, the world’s most powerful computing complex. The center’s mission is to expand the boundaries of human understanding while ensuring the United States’ continued leadership in science, technology, engineering, and mathematics. Computational resources include:
    • Darter – a Cray XC30 system that pairs an Aries interconnect with Sonexion storage, providing both high scalability and sustained performance.
    • Beacon – an energy-efficient cluster that utilizes Intel® Xeon Phi™ coprocessors. Beacon provides 768 conventional cores and 11,520 accelerator cores that deliver over 210 TFLOP/s of combined computational performance, along with 12 TB of system memory, 1.5 TB of coprocessor memory, and over 73 TB of SSD storage in aggregate.
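
As referenced above, Newton jobs are submitted through a queue on the shared login node. The statement does not name the scheduler, so the sketch below assumes a qsub-style batch system purely for illustration; the job name, parallel environment, and solver binary are hypothetical placeholders.

    # Hedged sketch: compose and submit a batch job from the Newton
    # login node, assuming a qsub-style scheduler (illustrative only).
    import subprocess

    job_script = (
        "#!/bin/bash\n"
        "#$ -N example_job\n"      # job name (hypothetical)
        "#$ -cwd\n"                # run in the submission directory
        "#$ -pe openmpi 16\n"      # request 16 slots (hypothetical PE name)
        "./my_solver input.dat\n"  # hypothetical executable and input
    )

    with open("example_job.sh", "w") as f:
        f.write(job_script)

    # The scheduler prints a job ID on success and runs the script when
    # the requested resources become available.
    subprocess.run(["qsub", "example_job.sh"], check=True)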

The University of Tennessee, Knoxville’s Office of Information Technology (OIT) is responsible for managing a complex network that spans the campus and remote sites throughout the state. OIT Network Services’ responsibilities include the management of the local UTK network (both wired and wireless), a range of wide-area connections, and Internet/Internet2 connectivity. In addition, the group manages various security devices such as firewalls, VPN (Virtual Private Network) access devices, and IPS/IDS (Intrusion Prevention/Detection System) devices. At its core, the UTK network supports speeds of 10 gigabits per second (Gbps) and greater. Buildings are dual-connected with at least 1 Gbps, and many feature 10 Gbps connections to the core to accommodate applications that require very high bandwidth. Overall, the campus network can support approximately 36,000 100 Mbps connections, 28,000 Gigabit Ethernet connections, and 8,600 Ten Gigabit Ethernet connections.

A state-of-the-art wireless network consisting of more than 4,500 wireless access points spans the entire campus. A major upgrade to the wireless infrastructure, completed in the summer of 2015, supports 802.11a/g (54 Mbps), 802.11n (450 Mbps), and 802.11ac (1.3 Gbps). Remote offices are linked to the main campus network using connections ranging from 1.5 to 100 Mbps. Such locations include offices in Knoxville, Oak Ridge, Nashville, and Agriculture Extension facilities at various locations in Tennessee. Multiple, redundant connections provide the UTK community with access to the Internet at an aggregate speed of more than 10 Gbps. Multi-Gbps networks connect UTK to research facilities at ORNL (Titan supercomputer) and Internet2 sites via Southern Crossroads (http://www.sox.net). UTK is a Sponsored Education Group Participant (SEGP), a program that allows it to sponsor other educational institutions’ access to Internet2 resources. Two core firewalls and a multitude of smaller, department-sized firewalls are deployed and managed by Network Services. Secure remote access is facilitated through VPN appliances that support both IPsec and SSL VPN access.

Updated January 2017.
