Sensors Processor
Cambridge CMOS Sensors CCS811 NTC Thermistor Interface Guide

Cambridge CMOS Sensors is now a member of the ams Group.
Contact information: Headquarters: ams AG, Tobelbader Strasse 30, 8141 Premstaetten, Austria. Tel: +43 (0) 3136 500 0. e-Mail: *****************

Key Benefits
• Simple circuitry for determining temperature
• Cost effective and minimal PCB footprint
• Integrated MCU with ADC
• I2C digital interface
• Optimised low-power modes
• Compact 2.7 x 4.0 mm LGA package
• Proven technology platform
• On-board processing to reduce requirement on host processor
• Fast time-to-market
• Extended battery life
• Reduced component count
• Suitable for small form factor designs
• Highly reliable solution

Applications
• IAQ monitoring for Smart Home, Smartphones and accessories

Voltage Divider and ADC
To enable the NTC circuit, a voltage divider is constructed between the VDD, AUX and Sense pins. Refer to Figure 1 below.

Figure 1: CCS811 NTC Voltage Divider Circuit

The CCS811 has an internal ADC that can measure the voltages at VDD, AUX and Sense, enabling the CCS811 to calculate the voltages across the reference resistor (R_REF) and the NTC resistor (R_NTC). For optimal usage of ADC resolution, the value of R_REF and the nominal value of R_NTC should be approximately the same; the suggested value is 100 kΩ.

As temperature increases, R_NTC decreases. This causes the voltage sampled on the AUX signal to decrease. The opposite is true when temperature decreases: R_NTC increases.

Hardware Connection When NTC Is Not Required
If the system where the CCS811 is deployed uses another means of obtaining temperature, such as a combined temperature and humidity sensor, or if temperature is not measured in the system, then there is no requirement to connect the AUX and Sense pins to the thermistor. There is also no requirement to connect the reference resistor between AUX and VDD. Pins 4 and 5 must still be connected together for normal operation.

Determining Thermistor Resistance
The CCS811 samples the Sense and AUX ADCs concurrently, obtaining the two voltages V_REF and V_NTC. This allows the user to determine the resistance of the NTC thermistor. The following proof using Ohm's law can therefore be used to find the resistance R_NTC of the NTC thermistor:

I_ref = V_ref / R_ref
I_ntc = V_ntc / R_ntc
As I_ref = I_ntc: V_ref / R_ref = V_ntc / R_ntc
Therefore: R_ntc = (V_ntc x R_ref) / V_ref

Equation 1: R_NTC proof

The CCS811 NTC mailbox (ID = 6) provides the application with the V_REF and V_NTC values sampled by the AUX and Sense pins respectively. The format of this data is shown in Table 2.

Table 2: NTC Mailbox Data Encoding

Both V_REF and V_NTC are in millivolts. As R_REF is a discrete component with a known resistance, and as the CCS811 has provided the V_REF and V_NTC values, it is very simple to determine the R_NTC value. In the simplest case, when V_REF and V_NTC are equal, R_NTC is equal to R_REF.

Equations

Using the Simplified Steinhart Equation to Determine Temperature
After determining the value of R_NTC, the thermistor's data sheet must be consulted in order to understand how this can be used to calculate the ambient temperature. The most common method is to use a simplified version of the Steinhart equation. In that case the data sheet will contain an equation of the form shown in Equation 2:

B = log(R / R_O) / (1/T - 1/T_O)

Equation 2: Simplified Steinhart equation

The equation contains a number of parameters that are found in the NTC thermistor's data sheet. It also contains some parameters that the user must provide.
These are described in Table 3 below.

Table 3: Simplified Steinhart Equation Parameter Descriptions

Note that B, T_O and R_O are constant values that can be found in the thermistor's data sheet. Also observe that R is the resistance of the thermistor at the current temperature. This value is available to the application after reading the data in the CCS811 NTC mailbox and applying Equation 1. Therefore the only unknown value is the temperature T. This allows Equation 2 to be solved for T by rearranging it as shown in Equation 3:

1/T = 1/T_O + (1/B) x log(R / R_O)

Equation 3: Temperature calculation

The temperature can then be calculated in software as described in the subsequent sections.

The first step in calculating temperature is to perform a read of the NTC mailbox. This will look similar to the following; adapt it to the application's drivers and API:

i2c_write(CCS_811_ADDRESS, NTC_REG, i2c_buff, size=0);
i2c_read(CCS_811_ADDRESS, i2c_buff, size=4);

The first I2C transaction is a set-up write to the NTC mailbox (the argument NTC_REG has the value 6) with no data. This is followed by a 4-byte read that accesses the NTC mailbox and stores the 4 bytes of data in a byte/character array called i2c_buff. Please see CC-000803 Programming and Interfacing Guide for more details on handling the CCS811 I2C interface and timing requirements.

The V_REF and V_NTC values in i2c_buff can then be passed to a function that implements Equation 1:

#define RREF 100000

rntc = calc_rntc((uint16_t)(i2c_buff[0] << 8 | i2c_buff[1]),
                 (uint16_t)(i2c_buff[2] << 8 | i2c_buff[3]));

/* Equation 1: Rntc = Vntc * Rref / Vref, with both voltages in millivolts. */
uint32_t calc_rntc(uint16_t vref, uint16_t vntc)
{
    return ((uint32_t)vntc * RREF) / vref;   /* cast avoids 16/32-bit overflow */
}

The value of RREF used here equals the nominal R_O taken from the thermistor's data sheet (100 kΩ). As i2c_buff is an array of chars in this example, it has to be converted to two 16-bit scalars before being passed to the function. It is recommended to do the shifting explicitly, as this works on both big-endian and little-endian host processors. The returned rntc value can then be used to determine the temperature.

Application Software Running on a CPU with Floating Point Support
If the CPU running the application has floating point support and sufficient program memory for the floating point calculations and/or library functions, then the C standard math library can be used to help implement Equation 3. An example is shown below:

#define RNTC_25C  100000
#define BCONSTANT 4250
#define RNTC_TEMP 25

double calc_temp_from_ntc(uint32_t rntc)
{
    double ntc_temp;
    ntc_temp  = log((double)rntc / RNTC_25C);  // 1
    ntc_temp /= BCONSTANT;                     // 2
    ntc_temp += 1.0 / (RNTC_TEMP + 273.15);    // 3
    ntc_temp  = 1.0 / ntc_temp;                // 4
    ntc_temp -= 273.15;                        // 5
    return ntc_temp;
}

Recall Equation 3:

1/T = 1/T_O + (1/B) x log(R / R_O)

The application developer can extract the constant values from the thermistor's data sheet in order to solve for temperature. RNTC_25C, BCONSTANT and RNTC_TEMP correspond to R_O, B and T_O respectively. These can be placed in a C header file used by the application software.

Comments for each of the five steps in the software example above are as follows:
1. Calculate log(R/R_O) using the maths library, using the R_NTC value calculated from Equation 1.
2. Divide log(R/R_O) by the thermistor's B constant.
3. Add 1/T_O to the interim result. The equation requires all temperatures to be in Kelvin; adding 273.15 to T_O converts the value in the thermistor's data sheet, normally 25 °C (RNTC_TEMP), to Kelvin.
4. The result of step 3 is the reciprocal of the temperature, so this step yields the current temperature in Kelvin.
5. Convert from Kelvin to °C.
Application Software Running on a CPU with No Floating Point Support
If the application CPU does not have floating point support, or there is insufficient program memory available for the library and/or floating point calculations, then the temperature can be determined using linear interpolation between two points on the thermistor's temperature versus resistance curve. Finding the two points can be done as follows:
1. The thermistor's data sheet can be consulted to find the resistance at various points on the curve, normally in increments of 5 °C.
2. Pre-calculate the resistances at the temperatures required by the application in increments of x °C, where x is application specific.

The application software can store the resistance values in an array or lookup table and use the resistance R_NTC, calculated using Equation 1, as the input to the lookup. The lookup operation must return the two resistance points; essentially the goal is to determine which two resistance values R_NTC lies between. Linear interpolation can then be used to determine a reasonably accurate temperature value (a code sketch of this approach is given at the end of this section).

For example, assume that a 100 kΩ thermistor has a resistance of 210 kΩ at 10 °C and 270 kΩ at 5 °C, and that Equation 1 yielded a value of 222000 Ω for R_NTC. The following can be used to approximate the temperature:
1. The application must determine how many steps between the two points are required. Assuming 100 steps, subtract the low resistance from the high resistance and divide by 100:
   step_size = (270000 - 210000) / 100 = 600
   Thus each 0.05 °C increment in temperature corresponds to a 600 Ω decrease in resistance between these two points.
2. Calculate how many steps R_NTC is from the higher resistance:
   rntc_steps = (270000 - 222000) / step_size = 80
3. To avoid errors with integer types, work a few orders of magnitude larger, i.e. lower temperature x 1000 (5 x 1000 = 5000). As each step is 50 milli-°C (0.05 x 1000), the temperature is therefore:
   T = 5000 + rntc_steps x 50 = 9000, i.e. 9 degrees Celsius.

The temperature determined using the NTC thermistor circuit and the equations in the sections above can be used for temperature compensation on the CCS811. When writing the temperature to the ENV_DATA register it is necessary to also write the humidity. If the humidity is not known then the default value corresponding to 50% relative humidity must be written to ENV_DATA. For example, if the temperature is 30 °C and no RH data is available then the user must write four bytes as follows:

0x64, 0x00, 0x6E, 0x00

The first two bytes are the RH data in the format required by the CCS811 ENV_DATA register. The next two bytes are the temperature + 25 °C in the format required by ENV_DATA. Please consult the data sheet for more information. Additionally, a full example of using ENV_DATA is available in application note CC-000803-AN Programming and Interfacing Guide.
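The lookup and ENV_DATA steps above can be pulled together in integer-only C. The sketch below is illustrative rather than taken from this application note: apart from the 270 kΩ and 210 kΩ points quoted above, the table values are placeholders to be replaced with figures from the thermistor's data sheet, the 5 °C table step and 100 interpolation steps mirror the worked example, and the x2 scaling in pack_env_data is inferred from the 0x64/0x6E example; confirm the exact ENV_DATA encoding against the CCS811 data sheet.

/* Integer-only sketch of the lookup/interpolation method and of packing
 * ENV_DATA. Table entries are resistances (ohms) at 0, 5, 10, ... 50 degC;
 * only the 5 degC and 10 degC values come from the worked example above,
 * the rest are placeholders. */
#include <stdint.h>

static const uint32_t ntc_table[] = {
    336000, 270000, 210000, 168000, 135000, 100000,
     82000,  67000,  55000,  45000,  37000
};
#define NTC_TABLE_LEN     (sizeof(ntc_table) / sizeof(ntc_table[0]))
#define NTC_TABLE_STEP_MC 5000               /* 5 degC per entry, in milli-degC */

/* Returns temperature in milli-degC (9000 = 9 degC), or INT32_MIN if the
 * resistance falls outside the table. */
int32_t temp_from_rntc(uint32_t rntc)
{
    for (uint32_t i = 0; i + 1 < NTC_TABLE_LEN; i++) {
        uint32_t r_hi = ntc_table[i];        /* resistance at the lower temperature  */
        uint32_t r_lo = ntc_table[i + 1];    /* resistance at the higher temperature */
        if (rntc <= r_hi && rntc >= r_lo) {
            uint32_t step_size  = (r_hi - r_lo) / 100;   /* 100 steps of 0.05 degC */
            uint32_t rntc_steps = (r_hi - rntc) / step_size;
            return (int32_t)(i * NTC_TABLE_STEP_MC) + (int32_t)(rntc_steps * 50);
        }
    }
    return INT32_MIN;
}

/* Pack temperature (milli-degC) and relative humidity (%) into the four
 * ENV_DATA bytes; the x2 scaling is assumed from the 0x64/0x6E example and
 * ignores the fractional bits described in the CCS811 data sheet. */
void pack_env_data(int32_t temp_mc, uint32_t rh_percent, uint8_t env_data[4])
{
    env_data[0] = (uint8_t)(rh_percent * 2);                /* 50 %RH  -> 0x64 */
    env_data[1] = 0x00;
    env_data[2] = (uint8_t)(((temp_mc / 1000) + 25) * 2);   /* 30 degC -> 0x6E */
    env_data[3] = 0x00;
}

With the placeholder table, temp_from_rntc(222000) returns 9000 (9 °C), matching the worked example, and pack_env_data(30000, 50, buf) produces the 0x64, 0x00, 0x6E, 0x00 sequence shown above.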
The contents of this document are subject to change without notice. Customers are advised to consult with Cambridge CMOS Sensors (CCS) Ltd sales representatives before ordering or considering the use of CCS devices where failure or abnormal operation may directly affect human lives or cause physical injury or property damage, or where extremely high levels of reliability are demanded. CCS will not be responsible for damage arising from such use. As any device operated at high temperature inherently has a certain rate of failure, it is necessary to protect against injury, damage or loss from such failures by incorporating appropriate safety measures.
My Little Invention: An Ocean Garbage Processor (Essay)

My invention is an ocean garbage processor, which is designed to tackle the growing problem of marine pollution. The processor utilizes advanced technology to collect, sort, and dispose of various types of ocean debris, such as plastics, metals, and organic waste.

Firstly, the processor is equipped with a powerful suction system that can efficiently collect floating garbage from the surface of the water. It uses a combination of sensors and filters to identify and separate different types of waste. For example, plastics are sorted out and stored separately, while organic waste is processed to produce biogas or compost.

Secondly, the processor has a built-in recycling unit that can convert certain types of waste, such as plastics and metals, into reusable materials. This not only helps to reduce the amount of garbage in the ocean but also promotes a more sustainable approach to waste management.

Additionally, the processor is designed to minimize its impact on marine life. It is equipped with underwater cameras and sensors to detect the presence of marine animals and avoid any harm to them. Furthermore, it has a mechanism to release any trapped animals back into their natural habitat unharmed.

In conclusion, my ocean garbage processor is an innovative solution to the problem of marine pollution. It not only collects and disposes of ocean debris but also promotes recycling and protects marine life. With the implementation of this invention, we can make significant progress in cleaning up our oceans and preserving the delicate balance of marine ecosystems.
Machine Vision English Vocabulary

A
aberration 像差
accessory shoes 附件插座、热靴
accessory 附件
achromatic 消色差的
active 主动的、有源的
acutance 锐度
acute-matte 磨砂毛玻璃
adapter 适配器
advance system 输片系统
ae lock (AEL) 自动曝光锁定
af illuminator AF照明器
af spotbeam projector AF照明器
af (auto focus) 自动聚焦
algebraic operation 代数运算 (an image-processing operation covering the pixel-wise sum, difference, product and quotient of two images)
aliasing 走样（混叠）(an artefact produced when the image pixel spacing is too large relative to the image detail)
alkaline 碱性
ambient light 环境光
amplification factor 放大倍率
analog input/output boards 模拟输入输出板卡
analog-to-digital converters 模数转换器
ancillary devices 辅助产品
angle finder 弯角取景器
angle of view 视角
anti-red-eye 防红眼
aperture priority (AP) 光圈优先
aperture 光圈
apo (apochromat) 复消色差
application-development software 应用开发软件
application-specific software 应用软件
apz (advanced program zoom) 高级程序变焦
arc 弧 (part of a graph; a connected set of pixels representing a segment of a curve)
area ccd solid-state sensors 区域CCD固体传感器
area cmos sensors 区域CMOS传感器
area-array cameras 面阵相机
arrays 阵列
asa (american standards association) 美国标准协会
asics 专用集成电路
astigmatism 像散
attached coprocessors 附加协处理器
auto bracket 自动包围
auto composition 自动构图
auto exposure bracketing 自动包围曝光
auto exposure 自动曝光
auto film advance 自动进片
auto flash 自动闪光
auto loading 自动装片
auto multi-program 自动多程序
auto rewind 自动退片
auto wind 自动卷片
auto zoom 自动变焦
autofocus optics 自动聚焦光学元件
automatic exposure (AE) 自动曝光
automation/robotics 自动化/机器人技术
automation 自动化
auxiliary 辅助的

B
back light compensation 逆光补偿
back light 逆光、背光
back 机背
background 背景
backlighting devices 背光源
backplanes 底板
balance contrast 反差平衡
bar code system 条形码系统
barcode scanners 条形码扫描仪
barrel distortion 桶形畸变
base-stored image sensor (BASIS) 基存储影像传感器
battery check 电池检测
battery holder 电池手柄
bayonet 卡口
beam profilers 电子束仿形器
beam splitters 光分路器
bellows 皮腔
binary image 二值图像 (a digital image with only two grey levels, typically 0 and 1, black and white)
biometrics systems 生物测量系统
blue filter 蓝色滤光镜
blur 模糊 (a loss of image sharpness caused by defocus, low-pass filtering, camera motion and so on)
CTU Development Kit User Guide

Description
CambridgeIC's Central Tracking Unit (CTU) is a single chip processor for sensing linear and rotary position. CTU chips work with resonant inductive position sensors. These are manufactured with standard PCB technology, which means sensors are stable, robust and cost effective. Sensors are available in a number of measuring lengths and configurations.

Sensors work with contactless targets that comprise an electrical resonator sealed inside a precision housing. CambridgeIC's standard target is manufactured by Epcos AG, Europe's leading supplier of passive components.

The CTU Development Kit includes all of the parts needed to get a CTU position sensing system working. It includes a USB interface and software for a PC, for demonstration and evaluation. Alternatively, the CAM204 chip's interfaces are available on a 14-pin IDC connector. This enables the system to be interfaced with the customer's own host system during later development.

Kit Features
• CTU Development Board (CAM204 chip)
• 4 x Type 1 linear sensors from 25 mm to 200 mm
• 3 x Type 1 360° rotary sensors
• 4 x targets
• CTU Adapter for SPI to USB conversion
• PC software for Windows XP/Vista
• Ready to work inside the box

Applications
• Demonstration
• Evaluation
• Development
• One-off position sensing solutions

Product identification
Part no.    Description
013-7002    CTU Development Kit

Figure 1: CTU Development Kit

1 Quick Start Guide

1.1 Start with Kit Contents in the Box
The CTU Development Kit is designed to work inside the box for preliminary demonstration and evaluation. Only the CD and the PC end of the USB cable need be removed. The CTU Development Board is already connected to the sensors and to the CTU Adapter. Sensors are clipped onto the underside of a clear plastic tray, which also acts as guide rails for aligning targets correctly with sensors.
Once the system is working, with the positions of all 4 targets displayed on a PC, parts can be removed for further evaluation and integration.

1.2 Plug the USB Cable into a PC
The software provided is for Windows XP and Vista. Turn the PC on and plug the USB cable into a convenient port. The Windows Found New Hardware Wizard should launch.

1.3 Install the Windows Driver for the CTU Adapter
The driver files are on the CD provided. Copy these to a convenient file location on the PC. They may be required later if the Adapter is subsequently connected to a different USB port.
In the Windows Found New Hardware Wizard, select No, not this time, then click Next. Tell Windows where to look for the driver files just copied to the PC, and press Next. Windows will now issue a compatibility warning. Press Continue Anyway. After a few seconds the wizard should complete successfully. Press Finish to complete half of the driver installation, and repeat the process a second time to load both parts of the driver.
Full details, including screenshots and how to verify installation, are in the CTU Development Applications User Guide.

1.4 Install the CambridgeIC CTU Software
The CTU Development Applications are on the CD provided. Save these files to an appropriate directory on the target PC. It is recommended to shut all other programs before installation.
Locate and launch the setup.exe program from the directory containing the installer. Follow the on-screen prompts to complete the installation. Once completed, the applications require a restart of the PC for correct operation.

1.5 Launch CTU Demo
From the PC's Start menu, select All Programs > CambridgeIC CTU Software > CtuDemo.
CTU Demo should run and display the positions of each sensor's target. Targets are supplied with holders in a bag under the CD. Please see section 2 for how to align them with linear and rotary sensors.
For full details of CTU Demo and the other applications provided, please refer to the CTU Development Applications User Guide. This also includes a troubleshooting guide in case of difficulties.

1.6 Scaling Reported Position to Physical Units
The CTU Development Applications can display reported position in physical units (mm or degrees). This requires the correct value of the Sin Length parameter to be entered. Free space values are listed below for convenience. Please refer to the sensor's datasheet for other conditions.

Assembled sensor part number   Configuration   Measuring Length   Nominal Target Gap   Sin Length
013-0006                       Rotary          360°               1.5 mm               360°
013-0007                       Linear          25 mm                                   37.9 mm
013-0008                       Linear          50 mm                                   63.0 mm
013-0009                       Linear          100 mm                                  113.2 mm
013-0010                       Linear          200 mm                                  213.1 mm

2 Aligning Targets and Sensors
The CTU Development Kit is supplied with 4 targets and holders for the rotary and linear sensors.
For best performance, sensors and their targets should be aligned as shown in Figure 1 and Figure 2. Dimensions are in mm. The clear plastic tray maintains a minimum gap of approximately 1 mm between the sensor and target. The system will function with a gap of up to 5 mm (an additional 4 mm), although resolution will decrease.

Figure 2: Target alignment with linear sensors (target holder for linear sensors; a corresponding holder is supplied for rotary sensors)

Please refer to the sensor datasheets for detailed performance and alignment data.

3 CTU and Adapter Firmware Updates
The CTU Development Applications include UpdateCtuFirmware and UpdateAdapterFirmware, which can be used to load new CTU or Adapter firmware files (.cff or .aff respectively).

4 Precautions
Targets are a push fit in the holders supplied with the CTU Development Kit. These holders are not designed for high speed operation: targets may vibrate free and cause injury.

5 Next Steps
Once the system's function has been verified in the CTU Development Kit's box:
• Parts can be removed and evaluated using a customer's test equipment.
• Sensors and targets can be integrated with a customer's own product.
• A customer can develop their own PC applications that communicate with the CTU through the Adapter using:
  o LabVIEW, based on CambridgeIC example VIs, or
  o another .NET programming language, using CambridgeIC's Class Library and VB sample code.
• The CTU Development Board can be connected to the processor of an end product prototype, so that the processor can communicate with the CTU chip over its SPI interface.
• The CTU chip can be designed into the product itself.
If none of the sensors provided in the CTU Development Kit are appropriate for the end application, please contact CambridgeIC to discuss alternatives.
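When a host processor or PC application reads positions itself (see section 1.6 and the Next Steps above), the Sin Length scaling has to be applied in software. The sketch below is an illustration only: the assumption that the CTU reports position as a 16-bit fraction of one sine period is hypothetical, as is the function name; the actual report format and register layout are defined in the CAM204 datasheet and the Class Library documentation.

/* Illustrative only: convert a raw CTU position report into millimetres
 * using the Sin Length value from section 1.6. The 16-bit "fraction of one
 * sine period" format assumed here is hypothetical; consult the CAM204
 * datasheet for the real report format. */
#include <stdint.h>

#define SIN_LENGTH_MM 113.2f   /* 100 mm linear sensor (013-0009), from the table above */

float ctu_position_mm(uint16_t raw_position)
{
    return ((float)raw_position / 65536.0f) * SIN_LENGTH_MM;
}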
6 Kit Contents
The table below lists the contents of the CTU Development Kit. The first column is the part number of the hardware (if available separately), and the second is the part number of the datasheet (where applicable). Electronic copies of the datasheets are on the CD.

Hardware part no   Datasheet part no   Qty   Description
013-6003           –                   1     CD with software and documentation
013-5006           033-0010            1     CTU Development Board including CAM204 CTU chip
013-7001           033-0014            1     CTU Adapter
013-0006           033-0002            1     360° 25 mm diameter rotary Type 1 sensor assembly
013-0007           033-0004            1     25 mm linear Type 1 sensor assembly
013-0008           033-0004            1     50 mm linear Type 1 sensor assembly
013-0009           033-0004            1     100 mm linear Type 1 sensor assembly
013-0010           033-0004            1     200 mm linear Type 1 sensor assembly
013-0011           033-0015            1     360° 50 mm diameter rotary Type 1 sensor assembly
013-0012           033-0016            1     360° 36 mm diameter rotary Type 1 sensor assembly
013-6001           –                   4     300 mm 6-way sensor connecting cable
013-6002           –                   1     60 mm 14-way SPI interface connecting cable
033-0009           033-0009            1     Print-out of the CTU Development Kit User Guide
013-1005           033-0005            4     Standard targets
–                  –                   –     Target holders, 3 linear and 1 rotary

The table below lists the contents of the CD supplied with the CTU Development Kit. The CD also includes datasheets for the items listed above. Please contact CambridgeIC for the latest versions.

Part number   Description
021-0001      Windows Adapter Driver
021-0002      CambridgeIC.DLL Class Library for communication with a CTU through the Adapter
022-0003      CTU LabVIEW VIs
023-0001      Visual Basic Sample Code
026-0001      CambridgeIC CTU Software Installer
033-0003      Datasheet for CAM204 CTU chip
033-0006      Class Library User Guide
033-0007      CambridgeIC CTU Software User Guide
033-0008      CTU LabVIEW VI User Guide
033-0012      Resonant Inductive Operating Principle
033-0013      End Shaft Sensor Operating Principle

7 Document History
Revision   Date                Reason
A          24 August 2009      First draft
0002       5 November 2009     Added 50 mm sensor to hardware; added Visual Basic Sample Code to software; added Windows Adapter Driver
0003       23 November 2009    Updated Kit Features with extra sensor; updated introduction
0004       4 February 2010     Updated logo and style; added further documents to list of CD contents
0005       23 July 2010        Updated based on CAM204BE and new sensors
0006       16 July 2011        Illustrated new linear target holder design; updated with additional sensors; reformatted Kit Contents section

8 Contact Information
Cambridge Integrated Circuits Ltd
21 Sedley Taylor Road
Cambridge
CB2 8PW
UK
Tel: +44 (0) 1223 413500
********************

9 Legal
This document is © 2009-2011 Cambridge Integrated Circuits Ltd (CambridgeIC). It may not be reproduced, in whole or part, either in written or electronic form, without the consent of CambridgeIC. This document is subject to change without notice. It, and the products described in it ("Products"), are supplied on an as-is basis, and no warranty as to their suitability for any particular purpose is either made or implied. CambridgeIC will not accept any claim for damages as a result of the failure of the Products. The Products are not intended for use in medical applications, or other applications where their failure might reasonably be expected to result in personal injury. The publication of this document does not imply any license to use patents or other intellectual property rights.
Lorex B/W Quad Processor Instruction Manual

B/W QUAD PROCESSOR – Instruction Manual
Please be sure to read carefully and follow all the safety information. Keep the manual in a safe place for future reference.
www.lorexcctv.com

INTRODUCTION
Congratulations on your purchase of the B/W Quad Processor from Lorex. The B/W Quad Processor turns any monitor into a quad monitor with security features. Connect up to four cameras to the Quad Splitter and view all four locations simultaneously in real time (30 fps). The Quad Processor is an ideal security solution for small to medium sized commercial enterprises, as it allows you to use an existing monitor as a 4-channel observation system.
Connect the Quad Processor to a time-lapse VCR or DVR to capture footage and enable alarm recording. You can also connect PIR motion sensors to the Quad Processor in order to have motion detection capability. To learn more about this system or to find out more about other products available, please visit our website at www.lorexcctv.com.

FEATURES
B/W Quad Processor Features:
• Turns a slave monitor into a quad system
• View up to 4 camera locations simultaneously in real time (30 fps)
• Four alarm inputs for connection of external PIR sensors
• Viewing modes: quad, full-screen, auto sequencing
• Video loss alarm
• Digital freeze frame
• On-screen display: date, time, camera
• Event log lists instances of video loss and alarms
• Picture adjustable by channel
• 720 x 480 resolution at 256 grey levels

SAFETY WARNING
All the safety and operating instructions should be read before the appliance is operated. Improper operation may cause irreparable damage to the appliance.
• Please lift and place this equipment gently.
• Do not expose this equipment to direct sunlight.
• Do not use this equipment near water or in contact with water.
• Do not spill liquid of any kind on the equipment.
• Do not unplug the power connector before turning the power off correctly.
• This equipment should be operated using only the power source from the standard package.
• Unauthorized repair or parts substitutions may result in fire, electric shock or other hazards.
• Do not switch the power on and off within a short period (within 3 seconds).
• Do not attempt to service this equipment yourself. Refer all servicing to qualified service personnel.
• This unit should be operated only from the type of power source indicated on the manufacturer's label.
• The installation should conform to all local codes.

CAUTION: RISK OF ELECTRIC SHOCK. DO NOT OPEN. To reduce the risk of electric shock, do not remove the cover (or back). No user-serviceable parts inside. Refer servicing to qualified service personnel.

Explanation of the two symbols: The lightning flash with arrowhead symbol, within an equilateral triangle, is intended to alert the user to the presence of uninsulated "dangerous voltage" within the product's enclosure that may be of sufficient magnitude to constitute a risk of electric shock to persons. The exclamation point within an equilateral triangle is intended to alert the user to the presence of important operating and maintenance (servicing) instructions in the literature accompanying the appliance.

TABLE OF CONTENTS
1. System Includes
2. Front Panel Controls
3. LED Indicators
4. Menu Controls
5. Menu Options
6. Back Panel
7. Alarm and Video Loss
8. RS232 Remote Protocol
9. Troubleshooting
10. Technical Specifications
11. Optional Accessories
12. Appendix #1 – Typical Configuration
13. Appendix #2 – Connection to a PIR Motion Sensor
14. Appendix #3 – Connection to a Lorex Time Lapse VCR for Alarm Recording
15. Appendix #4 – Pin Configurations for Connection to PC
16. Lorex Warranty
SYSTEM INCLUDES
• B/W Quad Splitter
• Owner's Manual
• 12V DC Power Supply
Check your package to make sure that you received the complete system, including the components listed above.

FRONT PANEL CONTROLS
1. MENU / ESC – Pressing this button performs the following functions: 1) accesses the menu screen; 2) confirms selections when editing options in menu mode; 3) exits the menu screen.
2. (▲) – Goes to full-screen display of Camera 1. Also serves as the Up arrow key in menu mode.
3. (▼) – Goes to full-screen display of Camera 2. Also serves as the Down arrow key in menu mode.
4. (◄) – Goes to full-screen display of Camera 3. Also serves as the Left arrow key in menu mode.
5. (►) – Goes to full-screen display of Camera 4. Also serves as the Right arrow key in menu mode.
6. QUAD / (+) – Goes to quad display. Also used to increase values when making selections in menu mode.
7. AUTO / (−) – Initiates automatic sequencing between all four camera locations in the following screen order: CH1 > CH2 > CH3 > CH4 > QUAD > CH1 ... Note: sequencing automatically bypasses a channel with video loss.
8. VCR / FREEZE / ENTER – This button performs the following functions: 1) Press and hold for 3 seconds to enter VCR mode. 2) For the freeze feature, press once; the screen changes to quad mode and the word "FREEZE" appears in the top-right corner. Press a channel button and that channel's picture freezes, with the letter "Z" shown as an indicator. You can then continue to freeze and unfreeze individual channels by pressing their respective buttons. To exit freeze mode, press the VCR / FREEZE / ENTER button a second time. 3) In menu mode, pressing this button selects a menu option to be edited.

LED INDICATORS
Above each button is a red LED indicator, which shows the system's status and helps to navigate in menu mode.
Below is a summary of what each red LED indicator signifies:
1) When in menu mode, this red indicator illuminates.
2-5) When in full-screen viewing, the LED that is on indicates the channel being viewed (CH1/CH2/CH3/CH4). In freeze mode, the LEDs that are on indicate the channels being frozen. In menu mode, the LEDs that are on indicate keys that you need to press in order to navigate through menu options. During an alarm or video loss, the indicators blink on the corresponding channels experiencing the alarm or video loss.
6) This LED illuminates when in quad viewing mode.
7) This LED illuminates when in auto sequencing mode.
8) This LED illuminates when freeze mode is activated, or blinks when in VCR mode.

MENU CONTROLS
Pressing the MENU button brings up the menu screen. The buttons used for navigation in the menu are:
▲ and ▼: scroll up and down through menu options; change values.
◄ and ►: scroll sideways within a menu option that has been selected.
+ and −: increase or reduce the value of a menu option while editing it (when it is blinking).
ENTER: selects a submenu, or an option in a submenu, for browsing or modification.
MENU: completes modification of a menu option; exits a menu.

MENU OPTIONS
Upon entering the menu, you will see the following screen:
ALARM DURATION: 02 SEC
BAUD RATE: 9600 BPS
DWELL TIME: 02 SEC
INT AUDIBLE ALARM: ON OFF
EXT AUDIBLE ALARM: ON OFF
SET TIME: YY:MM:DD:HH:MM:SS
DWELL SETUP (CH 01): ON OFF
ALARM POLARITY (CH 01): HIGH LOW OFF
CAMERA TITLE SETUP (CH 01): CH1
EVENT LIST
SYSTEM RESET

Below is a description of each menu option:
Alarm Duration – Sets the length of the alarm time, between 1 and 99 seconds.
Baud Rate – Sets the data transmission speed to the remote computer. Available baud rates are: 1200, 2400, 3600, 4800, 9600, 19200, 57600, 115200.
Dwell Time – Selects how long a camera screen appears in sequencing mode before switching to the next screen. Programmable between 1 and 30 seconds.
Int Audible Alarm – Activates/deactivates the alarm buzzer function of the Quad Processor.
Ext Audible Alarm – Activates/deactivates the Alarm OUT function for external alarm devices.
Set Time – Allows you to program the time.
Dwell Setup – Activates/deactivates, per channel, which cameras are shown in auto sequencing.
Alarm Polarity – Selects the alarm input for each PIR motion sensor as LOW, HIGH or OFF per channel. LOW is the equivalent of normally open; HIGH is the equivalent of normally closed; OFF disables the PIR alarm feature. The setting defaults to LOW.
Camera Title Setup – Changes the title for each camera shown in the on-screen display.
Event List – Brings up a historical record of video loss and alarm occurrences. For more information, refer to the Alarm and Video Loss section.
System Reset – Returns the system to factory default settings.
Note: Leave the Alarm Polarity at LOW or OFF for channels with no PIR sensor connected; otherwise you will experience a continuous alarm.
BACK PANEL
1. POWER INPUT – Connects to the DC 12V power adapter.
2. VIDEO IN (Channels 1-4) – Four BNC video camera inputs for Channels 1-4. Connect four B/W cameras to these inputs. Note: each channel has a knob for adjusting the gain (contrast and brightness) of the picture.
3. VCR IN – Receives video from a VCR or DVR.
4. MONITOR OUT – Transmits video to a slave monitor (a TV or a security monitor).
5. QUAD OUT – Transmits video to a VCR or DVR.
6. RS232 / ALARM INPUT / EXTERNAL I/O – Connects to an alarm block, which accommodates PIR motion sensor connection and enables alarms. Alternatively, this port can serve as a remote-protocol interface for control via a PC.

ALARM AND VIDEO LOSS
When an alarm occurs, an alarm icon appears on the channel where motion detection is taking place. If the Internal Audible Alarm is set to ON, a buzzer also sounds.
A channel that does not have a camera connected shows a video-loss icon, and the red LED above that channel blinks. If the Internal Audible Alarm is set to ON, the buzzer also sounds when video is disconnected.
The Event List, available from the menu, lists occurrences of alarms and video loss with channel, event type, date and time, for example:
CH1 VLOSS 01/02/10 12:20:15
CH2 ALARM 02/01/15 22:10:25

RS232 REMOTE PROTOCOL
The RS232 / alarm input allows you to control the Quad Processor from your PC. To do this, you will require an RS232 COM port communication program installed on your PC; many such programs are available on the internet, some of them free.
The remote connection on the Quad Processor uses 8 data bits, 1 start bit and 1 stop bit. The data stream, with its control codes, is:
ACT – 0xFF 0xC0 ID FUNCTION STOP – 0x7F
The PC keyboard simulates the Quad Processor's keypad. For example, pressing "V" makes the Quad Processor go to VCR mode. Note that the letters are case sensitive. The corresponding keys and their codes are:
ASCII   Function        ASCII   Function
A       AUTO            1       CH1
Q       QUAD            2       CH2
M       MENU            3       CH3
V       VCR             4       CH4
F       FREEZE
For further instructions on connecting the Quad Processor to your PC's RS232 port, please refer to Appendix #4.
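A host program can send the same frame instead of a terminal application. The sketch below is an illustration only: it assumes a POSIX serial port at the 9600 bps default baud rate, and the ID byte value (0x00) and the device path are placeholders, since the manual does not state what device ID the unit expects; verify both before relying on it.

/* Illustrative host-side sender for the frame shown above:
 * 0xFF 0xC0 <ID> <FUNCTION> 0x7F at 9600 bps, 8 data bits, 1 stop bit.
 * The ID value and the serial device path are placeholders. */
#include <fcntl.h>
#include <stdio.h>
#include <termios.h>
#include <unistd.h>

static int send_quad_command(const char *dev, unsigned char id, unsigned char function)
{
    int fd = open(dev, O_RDWR | O_NOCTTY);
    if (fd < 0) {
        perror("open");
        return -1;
    }

    struct termios tio;
    tcgetattr(fd, &tio);
    cfmakeraw(&tio);                     /* raw 8-bit bytes, no translation */
    tio.c_cflag &= ~(PARENB | CSTOPB);   /* no parity, 1 stop bit */
    cfsetispeed(&tio, B9600);
    cfsetospeed(&tio, B9600);
    tcsetattr(fd, TCSANOW, &tio);

    unsigned char frame[5] = { 0xFF, 0xC0, id, function, 0x7F };
    ssize_t n = write(fd, frame, sizeof(frame));
    close(fd);
    return (n == (ssize_t)sizeof(frame)) ? 0 : -1;
}

int main(void)
{
    /* 'Q' selects quad display per the ASCII table above; ID 0x00 is assumed. */
    return send_quad_command("/dev/ttyS0", 0x00, 'Q');
}

Since the manual says the PC keyboard simulates the keypad, sending the bare ASCII character may also work; the framed form above follows the data-stream diagram.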
TROUBLESHOOTING
Problem: Unable to view cameras.
Remedy: If you are using a TV as a slave monitor, you may need to set it to its AUX input (sometimes referred to as "TV/Video"). Check that the wiring between the monitor and the Quad Processor is correct. Make sure that the camera is receiving power.

Problem: No power.
Remedy: Check the power source cord connections. Check that there is power at the outlet.

Problem: Continuous alarm.
Remedy: Set the Alarm Polarity according to whether your PIR motion sensor is normally open or normally closed. If no motion sensors are connected, set that channel's Alarm Polarity to OFF.

Problem: Bad picture quality after power on.
Remedy: Connect the video cable on Channel 1 before power on. The system will automatically detect the video standard as PAL or NTSC.

TECHNICAL SPECIFICATIONS
Because our products are subject to continuous improvement, SVC reserves the right to modify product designs and specifications without notice and without incurring any obligations. E&OE.

Video input port: 4 BNC camera inputs, 1 VCR input
Video output port: 1 quad output for VCR, 1 monitor output
Alarm input: 4
Alarm output: 1
Alarm duration: 1-99 sec
Camera title: 10 characters
Time/date set: built-in real-time clock
Dwell time: 1-30 sec
RS-232 port: yes
Load impedance: 75 Ohms
Operating environment: 10-80% RH, 0°C-50°C
Power source: DC 12V ±10%, 500 mA
Power consumption: 6 W maximum
Dimensions (mm): 240 (W) x 45 (H) x 150 (D)
Weight: 1300 g

OPTIONAL ACCESSORIES
The following accessories are available to add to your existing system:
• Digital Video Recorder – Record images digitally to a hard-disk drive. Record up to 1850 hours (30 hours real time). Use the Quick Search feature to find specific recordings.
• Time Lapse VCR – Used to record key events. Select from a 40-hour real-time or 960-hour time-lapse VCR.
• Specialty Cameras – Select from a wide assortment of specialty cameras (dome, weatherproof, bullet, waterproof, etc.) to suit individual needs.
• Night Vision – Weatherproof night vision accessory. Allows you to see in the dark at a distance of up to 35-40 (for use with observation system cameras).
• Auto Pan – Rotates the camera up to 270°.
• Sunshade Housing – Protects the observation camera from the sun.
To order these accessory items or for a complete line of accessories, visit www.lorexcctv.com.

APPENDIX #1 – TYPICAL CONFIGURATION
Note: If using a TV as a slave monitor, it may have to be set to AUX mode (sometimes referred to as TV/Video). The B&W Quad has BNC jacks to connect to other video sources. BNC couplers and video cables are not included.

APPENDIX #2 – ALARM BLOCK CONNECTION TO PIR MOTION SENSOR
Wires from the PIR motion sensors must be soldered to the contacts of the 9-pin alarm connector. Each PIR sensor connects via 2 wires: one to a specific channel input and the other to the ground input. For descriptions of each contact on the 9-pin connector of the Quad Processor, please refer to Appendix #4.

APPENDIX #3 – CONNECTION TO A LOREX TIME LAPSE VCR FOR ALARM RECORDING
In addition to connecting up to 4 PIR motion sensors, connect the 9-pin alarm connector to a security recorder such as a Lorex time-lapse VCR for alarm recording upon motion detection. Wires run from the time-lapse VCR's or DVR's alarm terminal must be soldered to the appropriate contacts of the 9-pin alarm connector. For descriptions of each contact on the 9-pin connector of the Quad Processor, please refer to Appendix #4.

APPENDIX #4 – PIN CONFIGURATIONS FOR CONNECTION TO A COMPUTER
(25-pin COM port and 9-pin COM port pin assignments; refer to the figures in the original manual.)
Three Command-Line Tools for Monitoring CPU and GPU Temperature on Linux

Even though technology advances by the day, laptop cooling remains a common problem.
Monitoring hardware temperatures can help you diagnose why a laptop is overheating.
In this article, we share some useful command-line tools to help you keep a close eye on CPU and GPU temperatures.
1. sensors — The sensors utility (part of lm-sensors) is a simple command-line program that displays the current readings of all sensor chips, including the CPU.
It comes pre-installed on some Linux distributions, such as Ubuntu; if it is not present, install it as shown below.
[linuxidc@linux:~/]$ sudo apt-get install lm-sensors
Then run the following command to detect all sensors on the system.
[linuxidc@linux:~/]$ sudo sensors-detect
Once the sensors have been detected, run the following command to check the CPU temperature, GPU temperature, fan speed, voltages and more.
[linuxidc@linuxidc:~/]$ sensors
Example output:

dell_smm-virtual-0
Adapter: Virtual device
Processor Fan: 2515 RPM
CPU:            +55.0°C
Ambient:        +38.0°C
SODIMM:         +42.0°C

coretemp-isa-0000
Adapter: ISA adapter
Package id 0:  +56.0°C  (high = +100.0°C, crit = +100.0°C)
Core 0:        +54.0°C  (high = +100.0°C, crit = +100.0°C)
Core 1:        +52.0°C  (high = +100.0°C, crit = +100.0°C)
Core 2:        +56.0°C  (high = +100.0°C, crit = +100.0°C)
Core 3:        +51.0°C  (high = +100.0°C, crit = +100.0°C)

2. Glances — Glances is a cross-platform, curses-based system monitoring tool written in Python.
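Whichever tool you use, the same readings can also be obtained programmatically: the kernel exposes thermal zones through sysfs, normally in millidegrees Celsius. The short C sketch below reads /sys/class/thermal/thermal_zone0/temp; the zone index and what it measures vary by machine (check the matching "type" file), so treat the exact path as an assumption.

/* Minimal sketch: read a thermal zone from sysfs and print it in degrees C.
 * thermal_zone0 is an assumption; list /sys/class/thermal to see what your
 * machine exposes and check each zone's "type" file. */
#include <stdio.h>

int main(void)
{
    const char *path = "/sys/class/thermal/thermal_zone0/temp";
    FILE *f = fopen(path, "r");
    if (!f) {
        perror(path);
        return 1;
    }

    long millideg = 0;
    if (fscanf(f, "%ld", &millideg) == 1)
        printf("CPU temperature: %.1f C\n", millideg / 1000.0);
    fclose(f);
    return 0;
}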
ARM Cortex-M7 Processor in Sensor Fusion – White Paper

ARM Cortex-M7 Processor in Sensor Fusion
D. Maidment – ARM Mobile Segment Manager
Ian Johnson – ARM Product Manager
Pramod Ramarao – Hillcrest Labs
July 2015
Copyright © 2015 ARM Limited. All rights reserved.

Introduction
Embedding sensor technology into products has been a very strong trend in recent years. From smartphones, fitness bands, gaming controllers, smartwatches and head mounted displays, the list of consumer devices with sensors is rapidly expanding. Traditionally we have seen sensors used to track location and movement (e.g. GPS, accelerometers, gyroscopes, and magnetometers), but this too is quickly shifting to a raft of new advanced sensors such as bio-medical, audio and visual.

The more sensors we embed, the more data is correspondingly generated. This data is useful and drives applications such as fitness and health tracking as well as the recent advances in virtual reality headsets. As we generate more data, we also consume more energy making 'sense of sensors' in more and more sophisticated applications. This is where sensor fusion comes in. Sensor fusion is the smart combining and interpretation of disparate sensor feeds, giving the application a far greater insight into a user's behaviour or movement.

The ARM® Cortex®-M processor family is widely used in sensor fusion and can be found in many applications today. From the ultra-low power Cortex-M0+ core through to the high performance Cortex-M7 core, the Cortex-M family of processors offers a wide range of performance points to suit varied applications.

Hillcrest Labs is the leading provider of software solutions for sensor-enabled products. More than a decade of research and development has led to a portfolio of unique IP, which transforms sensor data into contextual information for use in a variety of consumer electronic devices and applications. Hillcrest's sensor fusion and processing technology, used by many of the world's leading CE brands, is in millions of consumers' homes, offices, pockets, and hands around the world.

This paper sets out to explain the background of sensor fusion processing and to present to the reader the advances in device capability that will be enabled as a result of the Cortex-M7 processor. As well as explaining the benefits of the Cortex-M7 architecture in sensor fusion, this paper goes on to present two application examples. The first shows how sensor fusion on a Cortex-M7 processor can be used as an offload engine to save energy in a high-end head mounted display application. In contrast, the second application example takes the reader through the benefits of using a Cortex-M7 core in a standalone configuration to run both the sensor fusion software and the main application software in a smartwatch application.

An Introduction to the ARM Cortex-M7 Processor
The Cortex-M7 processor is the latest member of the energy-efficient Cortex-M family of processors, which deliver 32-bit performance together with very fast, deterministic handling of interrupts.
This combination makes the family ideal for use in embedded applications requiring high performance and real-time response.

The Cortex-M7 processor has the same Cortex-M programmers' model as the Cortex-M3 and Cortex-M4 (code which runs on Cortex-M3 and Cortex-M4 will run unchanged on Cortex-M7), but it introduces many new optional features in terms of memory and system interfaces and has a powerful new microarchitecture.

The relationship between the instruction sets of the Cortex-M processor family is shown in the diagram below:

Figure 1 – Cortex-M Family of Processors Instruction Set

The key characteristics of the Cortex-M7 processor include:
− High performance six stage superscalar pipeline, with branch prediction
− Powerful instruction set with SIMD, saturating arithmetic, and single cycle MAC for efficient DSP (illustrated in the sketch after this list)
− Optional 64-bit Instruction Tightly Coupled Memory (I-TCM), and optional 2x32-bit Data TCM (D-TCM), with support for custom Error Correction Code (ECC) implementation for each of the TCM interfaces
− 64-bit AMBA 4 AXI bus interface for access to memory and slower or more complex peripherals
− Optional instruction cache (from 4kB to 64kB) and data cache (from 4kB to 64kB), with optional ECC support for each of the cache memories
− Optional low-latency AHB peripheral bus interface (referred to as AHBP)
− AHB slave interface (AHBS) to allow DMA access to the TCMs
− Integrated Nested Vectored Interrupt Controller (NVIC) with 1 to 240 interrupts, with 3 to 8-bit programmable priority level registers
− Optional Memory Protection Unit (MPU) with 8 or 16 regions
− Optional Floating Point Unit (FPU) with support for single- and double-precision IEEE-754 floating point instructions
− Powerful debug features, with optional full instruction and data trace

Figure 2 – Cortex-M7 Processor Block Diagram
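To make the SIMD and single-cycle MAC characteristics concrete, the sketch below shows a Q15 dot product written with the __SMLAD intrinsic from CMSIS-Core, which maps to the dual 16-bit multiply-accumulate instruction referred to above. It is a generic illustration rather than part of this paper; the portable scalar path is included so it also builds for targets without the DSP extension.

/* Q15 dot product: on Cortex-M7 (DSP extension) each __SMLAD performs two
 * 16x16 multiply-accumulates in one instruction. Generic illustration only. */
#include <stdint.h>
#include <string.h>

#if defined(__ARM_FEATURE_DSP)
#include "cmsis_compiler.h"   /* provides __SMLAD() */
#endif

int32_t dot_q15(const int16_t *a, const int16_t *b, uint32_t n)
{
    int32_t acc = 0;
    uint32_t i = 0;

#if defined(__ARM_FEATURE_DSP)
    for (; i + 1 < n; i += 2) {
        uint32_t pa, pb;                     /* two packed 16-bit samples each */
        memcpy(&pa, &a[i], sizeof(pa));
        memcpy(&pb, &b[i], sizeof(pb));
        acc = (int32_t)__SMLAD(pa, pb, (uint32_t)acc);
    }
#endif
    for (; i < n; i++)                       /* scalar tail / portable fallback */
        acc += (int32_t)a[i] * b[i];

    return acc;
}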
An Introduction to Sensor Fusion
Sensors are vital to the user experience in consumer electronics today. From your phone knowing which way you are facing on a map to your smartwatch auto-logging your activity and sleep 24/7, sensors are at the heart of how we interact with our devices. However, the quality of the user experience is largely driven not by sensors themselves but by the sensor fusion algorithms, which turn sensor data into useful, application-ready information.

Figure 3 – Sensor Hub System Architecture

Sensors are small, noisy, and their signals are easily distorted and susceptible to interference; sensor fusion and processing software adds calibration, fusion, and much more to make the data more accurate, reliable and ready to be exposed to real-world applications. Sensor fusion itself is not simple, and can be compared to an iceberg – the 'visible' sensor fusion is a small, relatively simple set of algorithms. However, those algorithms rely on a hidden and complicated world of larger systems challenges, which must be addressed to provide high quality data to the fusion system. When good sensor fusion is integrated correctly into the sensor system, it can have dramatic impacts on the user experience.

The user experience benefits of good sensor fusion can broadly be assigned to two categories:

Enabling New Applications:
Sensor fusion and processing provide unique information on the device, user, and environment, which enables new applications and more personalized computing. Examples today include activity trackers, which monitor your steps and daily activity to encourage a healthier lifestyle, and gesture recognition, which acts as an interface to eliminate dozens of screen taps. Soon sensor fusion will enable your phone to guide you to an indoor platform at the train station without satellites, and track head movement to power virtual reality headsets. Beyond these application examples, contextual computing can enable your phone or wearable to deliver useful information before you even ask for it.

Saving Power:
Sensor fusion and processing can also help conserve power based on device context. As an example, if the phone is sitting on a desk in your office and has not moved in several hours, the phone does not have to sample the GPS or otherwise calculate location. Similar techniques can be used to automatically manage phone functions while you are in cars or on public transport. While these may seem like small steps, the associated power savings can really add up.

To enable these user experience benefits we need the sensors to be 'always-on' and gathering data regardless of whether the device is actively being used. That means we need a way of gathering, filtering, and analyzing the data from sensors without consuming significant amounts of the phone's battery or processing resources. This has led to the rise of a type of processor known as a "sensor hub". A sensor hub is a dedicated processor, typically based on the ARM Cortex-M processor series architecture, which handles sensor processing. By optimizing the processor and the sensor fusion and processing software, we can enable the benefits of always-on processing with minimal impact on device battery life.

The Cortex-M7 Processor: 'High Resolution Sensor Fusion'
The Cortex-M7 processor brings a number of architectural enhancements that benefit sensor fusion algorithms. The unique characteristics of the Cortex-M7 core allow more efficient execution of sensor fusion algorithms. This in turn results in improved latency and overall lower system power (thereby extending battery life).

Specific architectural features of the Cortex-M7 processor that lend themselves to sensor fusion include:
− A superscalar architecture enabling an increase in performance
− Single cycle MACs, resulting in fewer instructions to do more math processing
− SIMD capability, to speed up the complex calculations required in sensor fusion (e.g. quaternion multiplications; see the sketch after this list)
− Efficient access to on-chip RAM, which helps with context classifiers and chaining of multiple classifiers for better context. Local on-chip RAM access also saves energy vs. off-chip DDR accesses and as such increases overall battery efficiency
− When integrating the sensor hub into a wider SoC, the use of a cache allows efficient sharing of memories and minimizes off-chip accesses, hence saving energy and increasing overall performance
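As an illustration of the kind of arithmetic the SIMD and MAC bullets refer to, the sketch below shows a plain-C quaternion multiply and a single gyroscope integration step of the sort found in orientation filters. It is a generic textbook formulation, not Hillcrest's algorithm or ARM reference code.

/* Generic quaternion update step used in many orientation filters.
 * q = [w, x, y, z]; gyro rates in rad/s; dt in seconds. Illustration only. */
#include <math.h>

typedef struct { float w, x, y, z; } quat_t;

/* Hamilton product r = a * b */
static quat_t quat_mul(quat_t a, quat_t b)
{
    quat_t r;
    r.w = a.w*b.w - a.x*b.x - a.y*b.y - a.z*b.z;
    r.x = a.w*b.x + a.x*b.w + a.y*b.z - a.z*b.y;
    r.y = a.w*b.y - a.x*b.z + a.y*b.w + a.z*b.x;
    r.z = a.w*b.z + a.x*b.y - a.y*b.x + a.z*b.w;
    return r;
}

/* Integrate body-frame angular rate over dt: q <- q + 0.5 * q * (0, wx, wy, wz) * dt */
void quat_integrate_gyro(quat_t *q, float wx, float wy, float wz, float dt)
{
    quat_t omega = { 0.0f, wx, wy, wz };
    quat_t dq = quat_mul(*q, omega);

    q->w += 0.5f * dq.w * dt;
    q->x += 0.5f * dq.x * dt;
    q->y += 0.5f * dq.y * dt;
    q->z += 0.5f * dq.z * dt;

    /* Renormalize to keep q a unit quaternion despite integration error. */
    float n = sqrtf(q->w*q->w + q->x*q->x + q->y*q->y + q->z*q->z);
    q->w /= n; q->x /= n; q->y /= n; q->z /= n;
}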
Application Example #1: Sensor Fusion Offload

Figure 4 – High-end Wearable Example System Architecture

Virtual reality (VR) systems rely on tricking the brain into believing the virtual world is real. That means it is vital for the system to translate real-world actions into the virtual world with the greatest precision and the lowest latency possible. A commonly used architecture of a modern wearable device uses a Cortex-A processor to run a rich OS providing a sophisticated user interface, while offloading the sensor fusion function, which requires deterministic real-time response, to a Cortex-M processor. As sensor data processing requirements grow, the Cortex-M7 is an ideal processor for this function. The processing power of the Cortex-M7 core provides the perfect foundation for the sensor fusion and processing to meet the performance requirements of good head tracking solutions used in VR systems.

Take latency as an example, which is widely considered a primary cause of 'simulator sickness'. Latency is the time between head movement and the adjustment of the image which corresponds to that movement. Many system factors contribute to latency, but gathering, processing, and delivering sensor data to the system is a notable one.

The Cortex-M7 processor enables high-resolution sensor sampling and sensor fusion, including dynamic calibration of sensors. Typical sensor fusion output data rates (ODR) used in mainstream head trackers today are in the order of a few 100 Hz, but the extra processing capability of the Cortex-M7 core allows that to scale upwards of 1 kHz. Primarily, this increased ODR means there is minimal delay when gathering data packets at an appropriate time for the graphics rendering, as the video frame rate is different to the sensor fusion processing rate. Additionally, it enables a denser sample for more accurate predictive head tracking. By analyzing patterns and predicting future movement, latency can be reduced, but the density of data available over the course of a few milliseconds is vital to the performance of head tracking. The further we have to look into the past to obtain an appropriate body of data to use in the predictions, the less reliable the estimate becomes. Higher ODR (1 kHz or more) increases the accuracy of the prediction and therefore the quality of the user experience.

The future of virtual reality is not only in the best head tracking possible; it is also in full body interactions, and controls with voice and other natural interaction methods. Even with high ODR sensor fusion processing, the Cortex-M7 processor would still have additional processing power to support these additional functions. For example, if additional sensors tracked movement of other parts of the body through a network of body-worn sensors, the Cortex-M7 processor could fuse that data together to track full-body orientation changes. Additionally, if voice controls and the recognition of specific command keywords were also added, the Cortex-M7 would have enough processing power for these additional applications.
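The predictive head tracking described above is often implemented as a short constant-angular-velocity extrapolation of the current orientation. The sketch below is a generic small-angle version of that idea, using the same [w, x, y, z] quaternion layout as the previous sketch; it is illustrative only and is not taken from this paper or from Hillcrest's products.

/* Predict orientation `dt_ahead` seconds ahead assuming constant angular rate.
 * Small-angle approximation of the delta rotation; generic illustration only.
 * q[4] = {w, x, y, z}; gyro rates in rad/s. */
#include <math.h>

void predict_orientation(const float q[4], const float gyro[3],
                         float dt_ahead, float q_pred[4])
{
    /* Half-angle vector of the predicted incremental rotation. */
    float hx = 0.5f * gyro[0] * dt_ahead;
    float hy = 0.5f * gyro[1] * dt_ahead;
    float hz = 0.5f * gyro[2] * dt_ahead;
    float dq[4] = { 1.0f, hx, hy, hz };   /* small-angle delta quaternion */

    /* q_pred = q * dq (Hamilton product), i.e. rotate on in the body frame. */
    q_pred[0] = q[0]*dq[0] - q[1]*dq[1] - q[2]*dq[2] - q[3]*dq[3];
    q_pred[1] = q[0]*dq[1] + q[1]*dq[0] + q[2]*dq[3] - q[3]*dq[2];
    q_pred[2] = q[0]*dq[2] - q[1]*dq[3] + q[2]*dq[0] + q[3]*dq[1];
    q_pred[3] = q[0]*dq[3] + q[1]*dq[2] - q[2]*dq[1] + q[3]*dq[0];

    /* Renormalize. */
    float n = sqrtf(q_pred[0]*q_pred[0] + q_pred[1]*q_pred[1]
                  + q_pred[2]*q_pred[2] + q_pred[3]*q_pred[3]);
    for (int i = 0; i < 4; i++)
        q_pred[i] /= n;
}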
F or e xample, i f a dditional s ensors t racked m ovement o f o ther parts o f t he b ody t hrough a n etwork o f b ody-‐worn s ensors, t he C ortex-‐M7 p rocessor c ould f use t hat data t ogether t o t rack f ull-‐body o rientation c hanges. A dditionally, i f v oice c ontrols a nd t he recognition o f s pecific c ommand k eywords w ere a lso a dded, t he C ortex-‐M7 w ould h ave e nough processing p ower f or t hese a dditional a pplications.Copyright © 2015 ARM Limited. All rights reserved.Copyright © 2015 ARM Limited. All rights reserved.Application E xample #2: S ensor F usion S tandaloneFigure 5 -‐ S ingle C ortex-‐M7 P rocessor E xample S ystemWearables a re a h otbed f or s ensor a doption. N owhere i s t his m ore s o t han i n w rist-‐worn w earables, where u sing s ensors t o t rack u ser a ctivity h ave b ecome c ommonplace. C urrent t rends s how m ore sophisticated, m ultipurpose d evices s uch a s s martwatches g aining m arket s hare a t t he e xpense o f simpler d evices s uch a s a ctivity t racking w ristbands. T hese d evices f eature m ore s ensors o ften incorporating p ressure, h eart r ate, g yroscopes, a nd m ore t o p rovide a dditional d ata t o t he u ser a nd to e nable b etter u ser i nterfaces.This t rend i s o nly g oing t o c ontinue a s m ore s ensors b ecome a vailable. A dditional m otion s ensors including g yroscopes a nd m agnetometers w ill h elp a dd r ichness a nd a ccuracy t o p ersonal c ontext tracking. E nvironmental s ensors, s uch a s U V l ight, h umidity, a nd t emperature, w ill e nable b etter user c ontext a nd e nhanced p ersonal c omfort. B iological s ensors w ill m easure h ydration, b lood oxygen a nd g lucose s aturation, s kin t emperature a nd s weat, a nd m ore t o p rovide u nique i nsights about t he u ser’s b ody a nd h ealth.Combining d ata f rom t his e xpanding a rray o f s ensors w ill r equire a p owerful y et p ower e fficient processor. T his w ill b e p articularly i mportant f or l ow-‐power c ontext c lassification. A dvanced context d etection r equires c omplex a lgorithms a nd t hese a lgorithms c an t ake a dvantage o f t he advanced f eatures o f t he C ortex-‐M7 p rocessor t o p rovide a ccurate y et l ow-‐power c ontext d etection for r ich u ser a pplications.Particularly u seful h ere i s t he s uitability o f t he C ortex-‐M7 c ore f or a udio p rocessing. W ith l imited interfaces, v oice i s a p rimary n atural i nteraction m ethod f or w earables, s o k eyword r ecognition w ill be v ital. I n a ddition, a r ich p icture o f c ontext c an b e g athered f rom a udio s ignatures. F or e xample, detecting w hether a u ser i s i n a c ar, o n a b us o r o n a t rain c an b e d ifficult t hrough m otion a nd environmental s ensors a lone. H owever, d istinctive a udio s ignatures c an g reatly i ncrease t he context d etection r eliability.Another a pplication, w hich w ill m ake f ull u se o f t he C ortex-‐M7 p rocessor’s f eatures, i s s martwatch-‐based p edestrian d ead r eckoning (PDR). W ith l ocation-‐based s ervices b ecoming m ore i mportant, and s ensors i n s martwatches b ecoming m ore s ophisticated, P DR w ill h ave a n e ssential r ole i n a ny low-‐power n avigation a pplication. H owever, P DR p laces a p remium o n t he a ccuracy o f s ensor d ata, making t he s ample r ate a nd e fficient d ynamic s ensor c alibration b oth v itally i mportant. 
Another application which will make full use of the Cortex-M7 processor's features is smartwatch-based pedestrian dead reckoning (PDR). With location-based services becoming more important, and sensors in smartwatches becoming more sophisticated, PDR will have an essential role in any low-power navigation application. However, PDR places a premium on the accuracy of sensor data, making the sample rate and efficient dynamic sensor calibration both vitally important. The Cortex-M7 processor is an ideal platform to support high sample rates and concurrent real-time calibration of several sensors to increase the accuracy of the PDR output.

A full navigation solution will fuse the accurate PDR output with external reference sources such as GNSS or Wi-Fi/beacons and map matching to increase the stability and accuracy of the navigation. The additional resources of the Cortex-M7 core make it uniquely able to unite these disparate data sources for complete and accurate navigation in a power-efficient manner.

Today most advanced wearables mimic the architecture of a smartphone, using a Cortex-M processor-based sensor hub in conjunction with a Cortex-A series application processor. However, the Cortex-M7 processor is sufficiently powerful that in many cases, even after completing the processing of data from numerous sensors as described above, it will still have remaining cycles for display management and the other important smartwatch functions. Therefore, for many wearable devices, the power of the Cortex-M7 processor will negate the need for a traditional application processor while extending battery life and time between charges, alleviating one of the primary design challenges surrounding smartwatches today.

Summary
The role of sensors is becoming ever more important, providing the ability to give unique insights into our lives and behaviours. From sports and fitness activity tracking to quantified-self medical tracking of parameters like heart rate and blood pressure, we see a constant need for more accurate and reliable sensor-based devices. In this paper, ARM and Hillcrest Labs have outlined the key areas of consideration for developers designing sensor-based systems, in turn allowing more accurate and insightful applications to be built. The ARM Cortex-M7 represents a significant uplift in processing capability, allowing more sophisticated sensor fusion algorithms to be deployed into advanced products whilst retaining the low-power characteristics essential for today's advanced 'always-on, always aware' products.
Copyright © 2015 ARM Limited. All rights reserved.
Automotive steering system: English-language reference paper

Sādhanā Vol. 33, Part 5, October 2008, pp. 581–590. © Printed in India

DSP-based electric power assisted steering using BLDC motor

R MURUGAN, S NANDAKUMAR and M S MOHIYADEEN
Bharat Electronics Limited, Nandambakkam, Chennai 600 089
e-mail: muruganr@bel.co.in; nandakumars@bel.co.in; mohiyadeenms@bel.co.in

Abstract. This paper introduces a design and implementation of electrically assisted power steering (EAS) using a BLDC motor for a vehicle. The control architecture consists of two layers of control, namely the vehicle-speed-associated control and the torque assist control. In the higher level of the control architecture, the vehicle speed controller works as an assistance level controller for the steering effort. In the lower level, the torque controller gives the effort level control. This has been realized with a torque sensor and a vehicle speed sensor interfaced to the DSP. For implementation in the system, a DSP-based BLDC motor controller with a three-phase inverter module is specially designed using Hall-effect sensor feedback and a single dc-link current sensor. This work is implemented in a Light Commercial Vehicle having a recirculating-ball type steering gear; this is the first time an EAS has been implemented for this type of vehicle anywhere in the world. Generally, an EAS has a clutch to disconnect the motor from the gearbox at high speed or under abnormal conditions. In this implementation the motor is directly coupled to the gearbox without a clutch, and all abnormalities are handled by the processor. This is implemented without modifying the vehicle supply system, i.e. without changing the existing alternator or the rating of the battery, and using the existing sensors. The design is such that the feel of the driver assistance can be varied easily at any time. The performance of the control system is experimentally verified and it is tested in a Light Commercial Vehicle (LCV).

Keywords. BLDC motor; EAS; steering.

1. Introduction
Power steering is a system for reducing the steering effort on vehicles by using an external source to assist in turning the wheels. Most new-generation vehicles now have power steering, owing to the trends toward greater vehicle mass and wider tires, which all increase the steering effort needed. Modern vehicles would be difficult to maneuver at low speeds (e.g. when parking) without assistance. Most power steering systems work by using a belt-driven pump to provide hydraulic pressure to the system. This hydraulic pressure is generated by a pump which is driven by the vehicle's engine. While the power steering is not used, i.e. when driving in a straight line, twin hydraulic lines provide equal pressure to both sides of the steering gear. When torque is applied to the steering wheel, the hydraulic lines provide unequal pressures and hence assist in turning the wheels in the intended direction. Electric power steering systems use electric components with no hydraulic systems at all. Sensors detect the motion and torque of the steering column, and a computer module applies assistive power via an electric motor coupled directly to either the steering gear or the steering column. This allows varying amounts of assistance to be applied depending on driving conditions. In the event of component failure, a mechanical linkage such as a rack and pinion serves as a back-up in a manner similar to that of hydraulic systems. Electric systems have an advantage in fuel efficiency because there is no hydraulic pump constantly running.
Their other big advantage is the elimination of a belt-driven engine accessory and of the several high-pressure hydraulic hoses between the hydraulic pump, mounted on the engine, and the steering gear, mounted on the chassis. This greatly simplifies manufacturing. The demand for electrically assisted power steering (EAS) has rapidly increased in the past few years because of the energy savings compared to hydraulic power steering (HPS). Alternating current (ac) motors are designed to be highly efficient and easily controlled with modern power circuitry. Because of the developments in switching techniques, it is quite feasible to use ac motors with a battery supply as the source. The traditional worm-gear-driven dc motor system is constrained by the limitations of the dc motor brushes and by the size of the motor for the same torque as a BLDC machine. In this case a BLDC motor has been used as the actuator in the application for electric power steering. The BLDC motor provides high torque and easy control (Chan & Fang 2002; Chu et al 2001; Desai & Emadi 2005; Jun-Uk Chu et al 2004; Kevin Brown et al 1990; Namhun Kim et al 2007). The basic mechanical properties of the vehicle are essentially invariant among all of the available brands. The electrically assisted power steering system consists of a BLDC motor mounted to the frame of the steering column and coupled to the wheels through a worm speed reducer. The electrically assisted power steering arrangement is shown in figure 1. An electrically assisted power steering system is composed of several parts such as the torque sensor, engine speed sensor, vehicle speed sensor, steering column, torsion bar and electronic control unit.

Figure 1. Electrically assisted power steering.

The torque sensor output gives the torque difference to be developed by the motor to reduce the effort required by the driver while steering. The engine speed signal is required so that assistance starts only when the engine is ON, in order to save battery life. The vehicle speed signal is required to control the assistance developed by the motor (for the same level of torque signal) at various vehicle speeds, as the assistance requirement comes down as the speed of the vehicle increases. The control architecture consists of two layers of control, namely the assistance level control and the torque control. In the higher level of the control architecture, the vehicle speed signal works as a reference for controlling the assistance to be developed by the motor. In the inner layer, the torque sensor signal governs the generation of torque. The torque output from the motor is a function of the torque sensor signal, and it depends on the torque difference between the steering wheel and the road wheels. The vehicle speed and engine speed signals are pulses of variable frequency. For system implementation, a DSP-based BLDC motor controller with a three-phase inverter module is specially designed using Hall-effect sensor feedback and a single dc-link current sensor. The torque and back-EMF equations of the BLDC motor are similar to those of a dc motor. Current sensing is ensured by a low-cost shunt resistor and is used for over-current protection and current feedback. Current control is achieved by a PID controller and pulse width modulation (PWM) signals with varying duty rates. Hall-effect sensors are available to detect the rotor shaft position, and are used for electronic commutation, motor speed and direction of rotation.
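A minimal sketch of the outer assistance-level layer described above is given below: the torque-sensor signal is scaled by a vehicle-speed-dependent gain (a boost curve) to form the current reference handed to the inner torque loop. The table values, the scaling constant and the function names are illustrative assumptions, not figures from the paper.

```c
/*
 * Illustrative sketch (values are hypothetical, not from the paper): the
 * outer "assistance level" layer turns the torque-sensor signal into a
 * current reference for the inner torque loop, with a gain that falls as
 * vehicle speed rises (a simple boost curve with linear interpolation).
 */
#include <stddef.h>

static const float speed_kmph[]  = {  0.0f, 20.0f, 40.0f, 80.0f, 120.0f };
static const float assist_gain[] = {  1.0f, 0.80f, 0.55f, 0.30f, 0.15f  };
#define N_POINTS (sizeof(speed_kmph) / sizeof(speed_kmph[0]))

/* Hypothetical scaling from driver torque [Nm] to motor current [A]. */
#define TORQUE_TO_AMPS 2.5f

static float assist_current_ref(float driver_torque_nm, float speed)
{
    float gain = assist_gain[N_POINTS - 1u];   /* above the last point */

    if (speed <= speed_kmph[0]) {
        gain = assist_gain[0];
    } else {
        for (size_t i = 1; i < N_POINTS; i++) {
            if (speed <= speed_kmph[i]) {
                float t = (speed - speed_kmph[i - 1]) /
                          (speed_kmph[i] - speed_kmph[i - 1]);
                gain = assist_gain[i - 1] +
                       t * (assist_gain[i] - assist_gain[i - 1]);
                break;
            }
        }
    }
    return gain * TORQUE_TO_AMPS * driver_torque_nm;
}
```

The falling gain reproduces the behaviour described in the text: full assistance at parking speeds and progressively less assistance as vehicle speed rises.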
2. Hardware architecture
A block diagram of the power assisted steering is illustrated in figure 2. The electrically assisted power steering system in a vehicle consists of the following parts:
a. Digital signal processor
b. Driver and protection card
c. Three-phase inverter
d. BLDC motor with Hall sensors
e. Reduction gear and sensors.

Figure 2. Block diagram of EAS.
Figure 3. DSP and protection card.

2.1 Processor
The DSP used for control and computation is the TMS320F24XX. The processor is a single-chip solution based on a 40 MIPS, 16-bit fixed-point DSP core with several associated peripherals such as a pulse width modulation (PWM) generator and an analog-to-digital converter (ADC) (BPRA055 1997; SPRU160C 1999; SPRU161C 1999).

2.2 Driver and protection circuit
The selected MOSFET driver is from the IR family. The PWM signals coming from the DSP are combined with protection logic and connected to the MOSFET driver. The output of the driver is directly connected to the MOSFET switches through series gate resistors. Current sensing is done by the low-cost shunt resistor. The voltage drop is processed by an analog amplifier, connected to the ADC module, and used for current feedback and over-current protection. The protection card used here is shown in figure 3.

2.3 Three-phase inverter module
The three-phase inverter module is developed using MOSFETs with a low ON-state drop and a high switching frequency. The three-phase inverter card used is shown in figure 4.

Figure 4. MOSFET card.
Figure 5. BLDC motor equivalent circuit.

2.4 BLDC motor with Hall sensors
The equivalent circuit of a BLDC motor is shown in figure 5. The BLDC motor used here has 8 magnetic pole pairs on the rotor and three-phase star-connected windings on the stator. The voltage equation of the BLDC motor can be represented as

$$\begin{bmatrix} V_a \\ V_b \\ V_c \end{bmatrix} = \begin{bmatrix} R & 0 & 0 \\ 0 & R & 0 \\ 0 & 0 & R \end{bmatrix} \begin{bmatrix} i_a \\ i_b \\ i_c \end{bmatrix} + \begin{bmatrix} L & 0 & 0 \\ 0 & L & 0 \\ 0 & 0 & L \end{bmatrix} \frac{d}{dt} \begin{bmatrix} i_a \\ i_b \\ i_c \end{bmatrix} + \begin{bmatrix} e_a \\ e_b \\ e_c \end{bmatrix} \qquad (1)$$

where
R = phase resistance
L = phase inductance
V_a, V_b, V_c = phase voltages
i_a, i_b, i_c = phase currents
e_a, e_b, e_c = back EMFs.

The generated motor torque is given by

$$T = \frac{e_a i_a + e_b i_b + e_c i_c}{\omega}, \qquad (2)$$

where ω is the motor angular velocity. The motor is equipped with three Hall-effect sensors. The Hall sensors produce three 180° (electrical) overlapping signals as shown in figure 6, thus providing the six mandatory commutation points. The Hall sensor outputs are directly connected to the processor, which generates the necessary switching sequence as per the commutation.

2.5 Gear box and sensing circuits
The BLDC motor is connected to a reduction gear system, as shown in figure 7, and drives the wheel. The torque difference between the steering wheel and the road wheel is sensed by a torsion bar. The output of the torsion bar is sensed by the torque sensor. The output of the torque sensor is directly connected to the ADC for processing.

Figure 6. Hall sensor waveform.

3. Controller design
3.1 Effort level control
The electrically assisted power steering (EAS) incorporates a brushless electric motor located on the steering column, on the pinion, that assists the driver when steering. Information like engine speed and the torque required is transmitted in real time to a DSP which determines the optimal degree of assistance the electric motor should apply. Figure 8 shows the effort required by the driver without assistance and with assistance for a stationary vehicle. Electrically assisted power steering eliminates the need for hydraulic fluids and complicated mechanical components (such as servo pumps), hydraulic lines, belts and pulleys, which add weight and volume. By eliminating the hydraulic pump, the EAS can operate without the help of the engine. Unlike a conventional hydraulic system, the EAS consumes energy only when providing assistance.
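Returning to the six-step commutation described in §2.4: as a concrete but hypothetical illustration, the sketch below maps the 3-bit Hall-sensor state to the pair of conducting phases. The specific ordering of entries depends on the motor construction and wiring, which the paper does not give, so the table values here are placeholders.

```c
/*
 * Illustrative six-step commutation lookup (generic example; the actual
 * Hall-state-to-phase mapping depends on the specific motor and wiring and
 * is not taken from the paper).  'hall' is a 3-bit code from the three Hall
 * sensors; the table selects which two of the three phases conduct.
 */
#include <stdint.h>

typedef enum { PHASE_OFF = 0, PHASE_HIGH, PHASE_LOW } phase_drive_t;

typedef struct {
    phase_drive_t a, b, c;
} commutation_step_t;

/* Hall codes 0 and 7 (all sensors low / all high) are invalid states. */
static const commutation_step_t commutation_table[8] = {
    [0] = { PHASE_OFF,  PHASE_OFF,  PHASE_OFF  },   /* invalid */
    [1] = { PHASE_HIGH, PHASE_LOW,  PHASE_OFF  },
    [2] = { PHASE_OFF,  PHASE_HIGH, PHASE_LOW  },
    [3] = { PHASE_HIGH, PHASE_OFF,  PHASE_LOW  },
    [4] = { PHASE_LOW,  PHASE_OFF,  PHASE_HIGH },
    [5] = { PHASE_OFF,  PHASE_LOW,  PHASE_HIGH },
    [6] = { PHASE_LOW,  PHASE_HIGH, PHASE_OFF  },
    [7] = { PHASE_OFF,  PHASE_OFF,  PHASE_OFF  },   /* invalid */
};

/* Called on every Hall edge (or every PWM period): pick the two conducting
 * phases for the current rotor sector. */
static commutation_step_t next_commutation(uint8_t hall /* 0..7 */)
{
    return commutation_table[hall & 0x7u];
}
```

On each Hall edge the DSP would look up the new step and update the six PWM outputs accordingly.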
The control algorithm for the electrically assisted steering system is shown in figure 9. The effective torque and velocity control of a BLDC motor is based on relatively simple torque and back-EMF equations, which are similar to those of the dc motor.

Figure 7. Gear box with motor.
Figure 8. Effort curve.

During any 120-degree interval of phase current I, the instantaneous power P being converted from electrical to mechanical is

$$P = \omega T_e = 2EI \qquad (3)$$

where
T_e = electromagnetic torque
E = induced EMF per phase.

The '2' in this equation arises from the fact that two phases are conducting. The per-phase induced EMF is

$$E = 2 N_{ph} B_g L r \omega \qquad (4)$$

where
N_ph = number of winding turns per phase
B_g = rotor magnetic field density
L = length of the rotor
r = internal radius of the rotor.

Figure 9. Control algorithms.

Using the above expressions, the electromagnetic torque is given by

$$T_e = 4 N_{ph} B_g L r I = K \phi I \qquad (5)$$

where
K = torque constant
φ = flux per pole pair.

The system takes the torque reference (I_ref) and the feedback line current (I_fb) as inputs and produces a duty-cycle reference as output. This is in effect a PI controller; the following equation is implemented:

$$D_{cycle} = K_p (I_{ref} - I_{fb}) + \frac{K_p}{T_i} \int (I_{ref} - I_{fb})\, dt, \qquad (6)$$

where
K_p = proportional constant
T_i = integral time constant.

Limiters are applied at the final controller output; the duty-cycle reference is clamped to the peak of the sawtooth carrier wave. Current control is achieved by pulse width modulation (PWM) signals at a fixed frequency of 20 kHz with varying duty cycles. The PWM width is determined by comparing the measured actual current with the desired reference current. To sum up, the back EMF is directly proportional to the motor velocity, and the torque production is almost directly proportional to the phase current. In this control scheme, torque production follows the principle that current should flow in only two of the three phases at a time. Only one current at a time needs to be controlled, so only one current sensor is necessary. The positioning of the current sensor allows the use of a low-cost resistor as a shunt.

3.2 Assistance level control
Figure 10 shows the effort required to be produced by the motor for various vehicle speeds. Variable steering assistance (higher at low vehicle speed and lower at high vehicle speed) improves drivability and active safety. This has been implemented by sensing the vehicle speed and accordingly modifying the effort to be produced by the electric motor by controlling the reference given to the controller.

Figure 10. Boost curve for various speeds.
Figure 11. Driving effort output from EAS.

4. Experimental results
In this section, the result is presented (figure 11) to confirm the validity of the proposed method during static steering. From the figure, we can see that the effort required by the driver is almost constant over the entire steering wheel rotation. The effort reduction comes to around 75%. The motor is selected such that the cogging torque is very low: the maximum peak cogging torque of the motor used, at 10 rpm, is 0.0056 Nm compared to a peak torque of 2.45 Nm. The acceleration and deceleration of the motor are managed in such a way that the driver does not feel the torque ripple at the steering wheel. Torque ripple is generally felt at low speed; here the system operates in a loop such that it is always in an acceleration/deceleration phase, so the perceived torque ripple is small. Furthermore, the mechanical system itself has a variable gear ratio and an inherent torque variation larger than the torque ripple produced by the motor. Hence the driver is not able to feel the torque ripple when comparing EAS ON mode and EAS OFF mode.
From this result, it is seen that the proposed EAS has performed as expected.
Maximum torque required (manual): 32 Nm
Torque required during power assistance: 8 Nm
Percentage assistance provided: 75%
Average current consumption: 8 A

5. Conclusion
For equivalent power steering efficiency, electrically assisted power steering improves fuel consumption by 4 percent or more compared to conventional hydraulic systems. The elimination of hydraulic fluids is also more environmentally friendly from an End of Life Vehicle (ELV) standpoint. Electronic data management (wheel angle, vehicle speed, etc.) can be used to fine-tune the power steering parameters, enhancing the car's drivability. Variable steering assistance improves drivability and active safety. Steering force feedback incorporates controlled re-centre positioning of the steering wheel and active damping of highway vibration.

References
BPRA055 March 1997 DSP solutions for BLDC motors. Literature number, Texas Instruments Europe
Chan Lie-Tong, Yan F, Shao-Yuan Fang 2002 In-wheel permanent-magnet brushless dc motor drive for an electric bicycle. IEEE Trans. Energy Conversion 17(2): 229–232
Chu C L, Tsai M C, Chen H Y 2001 Torque control of brushless DC motors applied to electric vehicles. IEEE Trans. 1–5
Desai, Ali Emadi 2005 A novel digital control technique for brushless DC motor drives: Current control. IEEE Trans. 326–331
Jun-Uk Chu, In-Hyuk Moon, Gi-Won Choi, Jei-Cheong Ryu, Mu-Seong Mun 2004 Design of BLDC motor controller for electric power wheelchair. IEEE Trans. 94–95
Kevin E Brown, Rafael M Inigo, Barry W Johnson 1990 Design, implementation, and testing of an adaptable optimal controller for an electric wheelchair. IEEE Trans. on Industry Application 26(6): 1144–1157
Namhun Kim, Hamid A Toliyat, Issa M Panahi, Min-Huei Kim 2007 BLDC motor control algorithm for low-cost industrial applications. IEEE Trans. 1400
SPRU160C June 1999 TMS320F/C24x DSP controllers reference guide: CPU and instruction set. Literature number, Texas Instruments Europe
SPRU161C June 1999 TMS320F/C240 DSP controllers reference guide: Peripheral library and specific devices. Literature number, Texas Instruments Europe
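To close the description of the current loop in §3.1, here is a discrete-time sketch of the PI law in equation (6) with the output clamped to the carrier range. The gains, sample period, normalisation and the integral-hold behaviour at saturation are placeholders added for illustration, not values or details taken from the paper.

```c
/*
 * Discrete-time sketch of the PI duty-cycle controller of equation (6),
 * with the output clamped to the PWM carrier range.  Gains, sample period
 * and limits are placeholders, not values from the paper.
 */
typedef struct {
    float kp;        /* proportional constant K_p                 */
    float ti;        /* integral time constant T_i, seconds       */
    float ts;        /* control period, seconds (e.g. 1/20 kHz)   */
    float integral;  /* running integral of the current error     */
} pi_ctrl_t;

static float pi_duty_cycle(pi_ctrl_t *c, float i_ref, float i_fb)
{
    float err = i_ref - i_fb;

    c->integral += err * c->ts;

    float duty = c->kp * err + (c->kp / c->ti) * c->integral;

    /* Clamp to the sawtooth carrier peak (here normalised to 0..1); when
     * saturated, undo the last integration step to limit wind-up. */
    if (duty > 1.0f) {
        duty = 1.0f;
        c->integral -= err * c->ts;
    } else if (duty < 0.0f) {
        duty = 0.0f;
        c->integral -= err * c->ts;
    }
    return duty;
}
```

Called once per 20 kHz PWM period with the torque reference and the shunt-measured current, this returns the duty-cycle reference that the compare unit turns into gate signals.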
6 of 6clocks, timers, interrupts, controls, a full duplex serial port, and +5V, +12V, and -12V power supplies.The host interface to each processor card supports 8Mbaud serial communications for downloading software and other system functions. The host can select which processor it con-nects with by writing to an 8-bit card-select register in the NB-bus master. Once every millisecond, the NBbus master writes this processor card address out to the NBbus. The addressed card then enables serial communications to that processor until another card is selected. In order to allow processor cards to re-quest a connection with the host, a single host request line runs to all of the processors. If this line is asserted, the host individ-ually polls each card until it finds the one requesting a connec-tion. Because the processor card address is only 8 bits wide, the system is limited to 256 cards. For implementations requiring a more than 10 cards, electrically isolated groups of cards can be logically connected by bidirectional bus repeaters. The overall system configuration is shown in Fig.6.Fig. 6. S YSTEM B LOCK D IAGRAMHigh level control generated by a host computer is passed to the AnthroformNeural Controller through a serial connection to the NBbus master. In addi-tion, this serial interface allows for downloading meta-neuron programs intothe processors and configuring the address sequence generated by the NBbusmaster.VI. C ONCLUSION AND F UTURE W ORKThe Anthroform Neural Controller is a distributed computa-tion system based on the human moto-sensory system residing in the spinal cord. At the time of this writing, the controller hardware design was complete and approximately half of the processor card logic had been fully tested. With this tool com-plete, we plan to interface it to the Anthroform Arm Manipula-tor and begin our research of neural control theories relevant to the reflexive functions of spinal circuits. In addition to its in-tended use, we are also considering using this parallel comput-ing architecture for controlling more typical robotic manipula-tors or for doing feedforward readout of standard neural net-work architectures.A CKNOWLEDGMENTWe gratefully acknowledge the efforts of Ching-Ping Chou, Pierre-Henry Marbot, and Professor Arun Somani for their help in defining the NBbus. We also thank Texas Instruments for their generous contributions.R EFERENCES[1]S.C. Jacobsen, E.K. Iversen, D.F. Knutti, R.T. Johnson, and K.B. Big-gers, “Design of the Utah/MIT dexterous hand,” Proc. IEEE Intl. Conf.on Robotics & Automation, pp. 1520-31, San Francisco, April, 1986. [2]Cover of “Presence: Teleoperators and Virtual Environments,” T.B.Sheridan, and T.A. Furness III, Ed., MIT Press, Winter 1992.[3] B. Hannaford, and J.M. Winters, “Actuator properties and movementcontrol: biological and technological models,” In “Multiple Muscle Sys-tems,” J.M. Winters, Ed., Springer Verlag, 1990.[4]H.F. Schulte, “The Characteristics of the McKibben Artificial Muscle,”Appendix H of “The Application of External Power in Prosthetics and Orthotics,” National Academy of Sciences - National Research Council, Washington, D.C., 1961.[5] E. Kandel, and J.H. Schwartz, “Principles of Neural Science,” Elsevier,New York, 1981.[6] A.L. Hodgkin, and A.F. Huxley, “A Quantitative Description of Mem-brane Currents and its Application to Conduction and Excitation in Nerve,” J. Physiology, vol. 117, pp. 500-544, 1952.5 of 6quence can be altered to increase the update rate of some con-nections over others. 
For instance, by using a sequence like1,9,2,9,3,9,4,9, etc., address 9 would be updated every 2 micro-seconds while the others were updated every 1 millisecond.V. H ARDWARE D ETAILSA circuit card on the NBbus contains a processor, NBbus in-terface hardware, and ports for connecting to input and output devices and a workstation for high level control and system functions. In this section we describe these interfaces as well as the important features of the processor we chose and how they all work together to form a complete system.The TMS320C30 Digital Signal Processor (DSP) was cho-sen as the main processing unit for this system. It is specialized for performing high-speed floating-point calculations appro-priate to our expected meta-neuron algorithms, and is support-ed by a C compiler and an extensive debugging system. The processor runs at 16MIPS, allowing roughly eight typical meta-neurons (with 100 inputs) to be hosted by each process-ing card. The TMS320C30 also has three independent external busses allowing easy interfacing to local memory, the NBbus, and I/O devices as is shown in Fig.5. The DSP also interfaces to a PC based software development system through a dedicat-ed emulator port, and has another high-speed (8Mbaud) serial port appropriate for communicating with a host computer. The NBbus itself supports no I/O functions, so all sensory input and motor output must be interfaced through a meta-neu-ron on a processor card. To prepare the processor cards to pro-vide a wide range of I/O formats, a generic I/O interface con-nector and bus were defined. It allows separately designed I/O daughter cards (e.g., analog to digital converters, digital to an-be easily connected to a processor card. The connector defini-tion allows for daughter cards to be stacked in order to have more than one connected to each processor card. The daughter4 of 6degree-of-freedom, a total of 42 meta-neurons will be needed. If we allow for some additional spinal circuits and a few meta-neurons to interpret higher level control that would bring the total to around N=100. This works out to about 1MIPS per meta-neuron for a system total of 100MIPS. Although this is a little beyond the power of today’s readily available proces-sors, the computations can be easily partitioned among several processors since the computational load per meta-neuron is quite reasonable. The fully-connected network of 100 meta-neurons, for example, could be realized on just 10 processors running at 10MIPSThis leaves us with the nontrivial problem of distributing all the meta-neuron output values to the proper processors where they are needed as inputs. A digital bus is appropriate, but based on the constraints developed in this section and the pre-vious one it should be specialized to:•support a fully connected network that is ex-pandable to hundreds of meta-neurons dis-tributed over a smaller number of proces-sors;•allow easy reconfiguration if a meta-neuronneeds to be moved to different processors;•allow each connection to be updated every1millisecond;•support a programmable delay for each con-nection; and•require a minimum amount of processoroverhead.These constraints lead to the definition of the Neural Broad-cast Bus in the following section, and (in combination with the processor requirements above) to the final definition of the pro-cessing hardware in Section V.IV. 
T HE N EURAL B ROADCAST B USThe Neural Broadcast Bus, or NBbus, is a special purpose digital bus which forms a fully connected network of up to 1024 meta-neurons and allows them to be distributed arbitrari-ly over a maximum of 256 processors. The data values passed by this bus are 32-bit floating point numbers which represent the average activity of the individual neurons that a meta-neu-ron is modeling (Fig.3).In order to limit the number of physical wires required to im-plement the bus, a broadcast scheme was used. The scheme functions like this: a single bus master consecutively generates all the meta-neuron address numbers between 0 and 1023, re-peating this cycle once every millisecond. Each address is sent out over a common set of address lines to every processor in the system. The processor that hosts6 the addressed meta-neu-ron then places the meta-neuron output value on a common set of data lines. Any processor that needs the output of the ad-dressed meta-neuron (in order to use it as an input for meta-neurons that ithosts) just reads the data bus to acquire the out-glion meta-neuron for feedback, and one ‘Ia’ inhibitory interneuron to ensure reciprocal inhibition of antagonist alpha motor meta-neurons.6. To “host” a meta-neuron is to run that meta-neuron’s algorithm and gen-erate its output value.put value. As long as a complete cycle of all the addresses is accomplished every millisecond, then all 1,047,552 individual input-output pairs are updated every millisecond resulting in a system capable of over 1 billion connections per second. In ad-dition, the system is very flexible since a meta-neuron may be moved to a different processor without reconfiguring the bus. To relocate a meta-neuron, the new host processor need only know that meta-neuron’s address and the addresses of its in-puts.There are only two major drawbacks to this scheme:1.there is no provision for emulating variable de-lays between meta-neurons; and2.each processor must be constantly monitoringthe NBbus address lines so it can read and writethe appropriate values for the meta-neurons ithosts.The first problem could be solved by placing a variable depth FIFO (First In First Out) memory between the processor and the NBbus. However, to support 1024 inputs using 8-bit-wide memories would require 128 chips per processor! Our solution is to use software ring buffers as part of each meta-neuron pro-cessing algorithm. This will require roughly 10 additional in-structions per input; doubling the processing power required per meta-neuron to:However, emulating input delays in software will provide sub-stantial flexibility for tuning meta-neuron circuits.To reduce processor overhead in maintaining the NBbus we designed a special hardware interface to automate the reading and writing of meta-neuron output values. A dual-ported mem-ory7 is inserted between the processor and the NBbus. This memory contains a 32-bit word for each of the 1024 possible meta-neuron addresses and a 1-bit flag that notes which of these addresses are being hosted by the processor. As NBbus addresses arrive, a state machine implemented in a program-mable logic array (PLA) first checks the flag bit, and then reads or writes the NBbus appropriately. All the processor needs to do is read the dual-port (without any timing constraints) and it will have the most recent update of a meta-neuron’s output val-ue. The processor can then run a meta-neuron algorithm on this and other inputs and write the new output value to the proper word in the dual-port. 
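As a hedged illustration of the software ring buffers proposed in this section for emulating inter-neuron delays, the sketch below delays one input stream by a configurable number of 1 ms NBbus update cycles. The buffer size, types and function names are assumptions of this sketch, not details of the actual meta-neuron code (which the paper leaves to future work).

```c
/*
 * Illustrative sketch only: a software ring buffer that delays one
 * meta-neuron input by a fixed number of 1 ms NBbus update cycles.
 * Sizes, types and names are assumptions, not taken from the paper.
 */
#include <stddef.h>

#define MAX_DELAY_CYCLES 64   /* supports delays up to 64 ms at a 1 kHz update rate */

typedef struct {
    float  samples[MAX_DELAY_CYCLES];
    size_t write_idx;      /* where the next incoming value is stored  */
    size_t delay;          /* desired delay in update cycles           */
} delay_line_t;

static void delay_line_init(delay_line_t *d, size_t delay_cycles)
{
    d->write_idx = 0;
    d->delay = (delay_cycles < MAX_DELAY_CYCLES) ? delay_cycles
                                                 : MAX_DELAY_CYCLES - 1u;
    for (size_t i = 0; i < MAX_DELAY_CYCLES; i++) {
        d->samples[i] = 0.0f;
    }
}

/* Called once per 1 ms update: store the newest input value and return the
 * value that arrived 'delay' cycles ago. */
static float delay_line_step(delay_line_t *d, float newest)
{
    d->samples[d->write_idx] = newest;
    size_t read_idx = (d->write_idx + MAX_DELAY_CYCLES - d->delay) % MAX_DELAY_CYCLES;
    d->write_idx = (d->write_idx + 1u) % MAX_DELAY_CYCLES;
    return d->samples[read_idx];
}
```

Per input this costs only a handful of instructions per update, which is consistent with the estimate of roughly ten additional instructions per input quoted in the text.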
The state machine will take care of writ-ing that output to the NBbus when the proper meta-neuron ad-dress next appears. From a software point of view, the entire NBbus can be thought of as a simple storage space for meta-neuron outputs. This abstraction is shown graphically in Fig.4. During neural emulations, the NBbus master will simply in-crement through all 1024 meta-neuron addresses, completing the cycle every millisecond. If the NBbus were to be used for other computation tasks this format can be changed easily. Byreducing the total number of NBbus addresses the cycle rate control lines that each access the same memory words. If both ports try to ac-cess the same word at the same time then one port is required to wait until the other finishes its access.N1−()inputsmeta-neuron10calc-instrinput10delay-instrinput+()×1msecN50MIPSmeta-neuron()≅3 of 6 human motor system. Each one processes a large number of in-puts to generate an output signal that it passes on to many otherneurons as their inputs. The number of these interconnectionsis very large, and this high connectivity is believed to be an im-portant feature of biological control systems.The interneuron communication medium imposes addition-al constraints on biological control systems. A complex elec-trochemical process is involved in a neuron firing its outputand in propagating this signal to another neuron [6]. The timeit takes to reset this process after firing is called the refractoryperiod, and it lasts about 1millisecond. Therefore, a fully acti-vated neuron will transmit a 1kHz pulse stream. This frequen-cy reduces considerably during inhibition. Since a neuron canonly fire once every millisecond, and since the individual fir-ings are identical, the maximum rate of information flow in anyneural connection is therefore about one value per millisecond.Another feature of neural communication is the propagationdelay of neuron outputs. This delay is proportional to thelength and type of the neuron, and can vary from nearly zero toas much as several tens of milliseconds for long sensory fibers.These connectivity and information flow constraints were cen-tral to the definition of our Neural Broadcast Bus (seeSection IV).The representation we chose for neuron emulation had todeal with yet another important feature of the human nervoussystem: its massive parallelism. Although thousands of neu-rons may be involved in even the simplest human reflex, our limited knowledge of their individual functions requires us to lump them into far fewer emulation units. For example, it is be-yond our current understanding to model the many alpha motor neurons driving the many muscle fibers of a single muscle. We therefore combine these neurons into a single emulation unit3. We call these emulation units meta-neuron s; they may model any number of actual neurons that work in parallel. In order to minimize the loss of information in the abstraction, the values passed between meta-neurons should represent the average ac-tivity of the individual neurons they emulate. The meta-neuron concept is illustrated graphically in Fig.3.III. 
A RCHITECTURAL I SSUESThe design philosophy we have adhered to in the develop-ment of this project is1.to design a human arm replica and control systemthat is based on known facts from human biome-chanics and neurophysiology; and2.to create engineering solutions to fill gaps thatexist in this knowledge and make the system trac-table.The known facts and engineering solutions presented in the previous section go a long way toward defining the Anthro-form Neural Controller hardware. In this section we outline the remaining system constraints; namely the end uses and result-ing computational requirements of our system.The Anthroform Neural Controller must operate as a testbed for the development of meta-neuron algorithms. This re-represented many alpha motor neurons working in parallel.quires that the processor itself:•be capable of running complex algorithmsthat generate new meta-neuron output val-ues at a rate of 1 kHz (approximating the up-per bound of the biological system);•be readily programmable in a high level lan-guage (for development of meta-neuron al-gorithms); and•be able to interface with the AnthroformArm Manipulator (for sensory inputs andmotor outputs).In order to understand the computational requirements of the above constraints we must first evaluate roughly how much computation per meta-neuron is required. We estimate that meta-neurons will require on the order of 10 “typical” proces-sor instructions per input in order to calculate the output value. If the system is to fully interconnect N meta-neurons4, each one will have N-1 inputs. Since each meta-neuron output must be calculated from these inputs once every 1millisecond, the processing load will be approximately:In order to implement the stretch reflex of Fig.2 we will need three meta-neurons for every muscle5. Assuming 7 de-grees-of-freedom, and an antagonist pair of muscles for each support a fully-connected network in which each node has a direct connection to every other.Fig. 3. T HE M ETA-N EURON A BSTRACTIONHundreds of individual neurons often work in parallel to perform a singlemacroscopic function. The hundreds of individual alpha motor neuronswhich drive a muscle (A), for instance, can be modeled by a single meta-neu-ron (B) that is responsible for driving that muscle. To minimize the loss ofinformation in the abstraction, the output of a meta-neuron represents the av-erage activity of the individual neurons it emulates.(A)(B)N1−()inputsmeta-neuron10instrinput×1msecN100MIPSmeta-neuron()≅1MIPS Millions'of'Instructions'Per'Second()106instrsec()103instrmsec()==the physical structure of the biological system.In order to free the processors for execution of neuron mod-els, connections between simulated neurons are maintained in hardware over a special purpose bus that we call the Neural Broadcast Bus, or NBbus. At first, these neuron models will provide only low-level reflexive control; higher level control signals will be provided by an NBbus interface to a worksta-tion. However, in the future we hope to better model high level control with an adaptive motor program generator. The overall system structure and its mapping to the biological system is shown in Fig.1. This paper focuses on the development of the Anthroform Neural Controller hardware. The software neuron models that run on this hardware will be the subject of future research.II. 
N EUROPHYSIOLOGY B ACKGROUNDIn this section we introduce some important features of a typical neural circuit 1 in order to gain an understanding of the end use of the Anthroform Neural Controller. We then go on to outline the structural characteristics of the human nervous sys-tem and how they guided the development of our design.The stretch reflex 2 shown in Fig.2 is an important neural circuit, and one we plan to emulate on the Anthroform Neural Controller. This spinal reflex contributes to posture mainte-nance by resisting muscle length perturbations. It functions as follows: when the joint is displaced by an external force, the re-sulting elongation of the many spindles within a muscle causes the associated afferent dorsal root ganglia neurons to fire.These signals are propagated up to a spinal cord segment“neural network ” now connotes a computation method.2. Informally referred to as the “knee jerk” reflex.Fig. 1. F ROM B IOLOGY TO THE A NTHROFORM B IOROBOTIC A RMThe current design of the Anthroform Neural Controller will model the ac-tivity of the human nervous system residing in the spinal cord. Higher level control will be provided by an interface to a workstation.rons associated with the perturbed muscle. In addition, the af-ferent signals activate the motor neurons of synergist muscles and the ‘Ia’ inhibitory interneurons. The interneurons in turn inhibit the hundreds of alpha motor neurons of the antagonist muscles. The activated and inhibited motor neurons then pass their signals to the appropriate muscle fibers within the mus-cles. The net result is to contract the stretched muscle and its synergists, and relax the antagonist muscles, thereby resisting the perturbation. As a result of being processed locally in the spinal cord, the reaction time of this reflex is on the order of 40milliseconds (as opposed to 120+ milliseconds for signals passing through conscious control in the cortex) [5].The neurons described above are the basic foundation of the= Dorsal Root Ganglia Neurons: bring in afferent sensory signals = Alpha Motor Neurons: send out efferent signals to muscle fibers Fig. 2. T HE S TRETCH R EFLEXStretching a muscle with an external force (1) causes the spindles to activate neurons in the dorsal root ganglia (2). These neurons then activate the alpha motor neurons of the same muscle (3) and those of the synergist muscle(s)(4) as well as activating the ‘Ia’ inhibitory interneurons (5). The interneurons then inhibit the alpha motor neurons of the antagonist muscle(s) (6). The stretch is thereby resisted (7) as the flexor muscles tighten and the extensor muscle relaxes. Each of the neuron symbols in the figure represents hundreds of individual neurons acting in parallel.Abstract—Existing robotic manipulator and controller designs compare unfavorably to the human arm when performing tasks in unstructured environments. So-called “anthropomorphic” de-signs have tried to improve robot performance in these domains by replicating the kinematic structure of the human arm while continuing to use traditional actuation and control techniques. In this paper we describe a versatile parallel computing architecture for emulating the spinal circuits of the human nervous system. When used in conjunction with a dynamically realistic replica of the human arm, this controller will provide a versatile tool for studying human moto-sensory control. 
The design is based on the structural constraints of the nervous system, and consists of a spe-cial purpose digital bus which implements connections between simulated neurons running on TMS 320C30 digital signal proces-sors (DSPs). The system supports up to 1024 individual neuron models, each connected to every other at least once every millisec-ond. These neuron models may be distributed over as many as 256 processor circuit cards, each supporting an interface for high lev-el control from a host and another for input and output functions.I. I NTRODUCTIONStandard industrial robot manipulator arms have been de-signed for precise positioning in highly structured and con-strained environments. The development of robotic and telero-botic systems for use in unstructured environments has there-fore been a significant challenge, and not altogether a success. Because the human arm performs so well in these domains,“anthropomorphic” designs have become increasingly popular. Such designs have traditionally focused on replicating the ki-nematic relationships of manipulator links and joints, and have neglected actuator, dynamic, and control aspects [1][2].The Anthroform Biorobotic Arm project is an attempt to build an anthropomorphic robot such that both the manipulator and its controller are based on current knowledge of human biomechanics and neurophysiology. The project is divided into two subprojects; the Anthroform Arm Manipulator and the An-throform Neural Controller. The primary goal in the develop-ment of these subprojects is to produce an accurate test bed for studying theories of neural control. In addition, however, the Anthroform Biorobotic Arm will represent a very humanlike manipulator with “natural” kinematics and dynamics, allowing possible applications ranging from teleoperation to prosthetic limbs.Successful control of the Anthroform Arm Manipulator by a neuron model implemented on the Anthroform Neural Con-troller will be a necessary (but not sufficient) condition to show that the model is indeed descriptive of a function of the human motor control system. However, this is only true to the extent that the arm’s dynamics accurately model the human muscu-loskeletal system. For the Anthroform Biorobotic Arm to be useful as a test bed for neural control models, it is crucial that it have a high level of biomechanical accuracy. To attain this accuracy, Prof. Jack Winters of Catholic University of America [3] is using the elements of Table1 below to develop our hu-man arm replica. All of these elements are attached at anatom-ically appropriate positions, and each has static and dynamic characteristics which make it a good model of the biological equivalent. Together they form a highly anthropomorphic 7 de-gree-of-freedom manipulator.The Anthroform Neural Controller is the control technology counterpart to the Anthroform Arm Manipulator. Through a combination of specialized hardware and software, it serves to simulate the activity of spinal neurons and their interconnec-tions. The major elements of the controller architecture are shown in Table2 below. The hardware design constraints were defined by the known (and relatively invariant) structural con-straints of the human nervous system (e.g., the connectivity and rate of information flow). 
In this way, the constraints imposed by the hardware in our system match those imposed by
Table 1. Elements of the Anthroform Arm Manipulator
Table 2. Elements of the Anthroform Neural Controller

The Anthroform Neural Controller: A System for Detailed Emulation of Neural Circuits
Ian MacDuff, Steven Venema, and Blake Hannaford
Biorobotics Laboratory, Dept. of Electrical Engineering, FT-10, University of Washington, Seattle, WA 98195
Email: blake@
This research was supported by the National Science Foundation Presidential Young Investigator Award and by the Office of Naval Research.