
INTERNATIONAL JOURNAL OF INNOVATIVE TECHNOLOGY AND CREATIVE ENGINEERING (ISSN:2045-8711) VOL.3 NO.5 MAY 2013


UK: Managing Editor, International Journal of Innovative Technology and Creative Engineering, 1a Park Lane, Cranford, London TW5 9WA, UK. E-Mail: editor@ijitce.co.uk. Phone: +44-773-043-0249
USA: Editor, International Journal of Innovative Technology and Creative Engineering, Dr. Arumugam, Department of Chemistry, University of Georgia, GA-30602, USA. Phone: 001-706-206-0812. Fax: 001-706-542-2626
India: Editor, International Journal of Innovative Technology & Creative Engineering, Dr. Arthanariee. A. M, Finance Tracking Center India, 17/14 Ganapathy Nagar 2nd Street, Ekkattuthangal, Chennai - 600032. Mobile: 91-7598208700




IJITCE PUBLICATION

INTERNATIONAL JOURNAL OF INNOVATIVE TECHNOLOGY & CREATIVE ENGINEERING Vol.3 No.5 May 2013

www.ijitce.co.uk



From the Editor's Desk

Dear Researcher,

Greetings! The research articles in this issue discuss equalizing the Doppler effect in M-QAM at a 900 MHz carrier frequency, and a GPS-IMU based autonomous target tracking algorithm. Let us review research around the world this month.

Passenger plane flies 800 kilometers without a pilot. The aircraft – a 19-seat propeller-powered business plane – was not merely on autopilot. It tested the detect-and-avoid technology which drones in civil airspace will need to ensure they keep their distance from other air traffic and automatically undertake collision-avoidance manoeuvres. The algorithm that runs this technology has been thrashed out with air-safety experts at the UK Civil Aviation Authority, who have ensured it sticks to the "rules of the air" understood by pilots worldwide.

Robot mimics flying fish. Alexis Lussier Desbiens and colleagues at Stanford University in California created the robot, which is 30 centimetres long and has a battery charged by a solar cell. Its motor compresses a lightweight carbon-fibre spring which, when released, flings the robot into the air. The robot's wing then pivots to maximise lift. At the peak of its leap, the wing flattens out to prolong glide time. In tests, the Jump Glider managed to fly about 5 meters per hop.

Silver nano-particles provide clean water for $2 a year. Silver nano-particles may be the key to supplying clean, affordable drinking water worldwide. Thalappil Pradeep at the Indian Institute of Technology in Chennai and colleagues have developed a filter based on an aluminium composite embedded with silver nanoparticles. As water flows through the filter, the nano-particles are oxidised and release ions, which kill viruses and bacteria and neutralise toxic chemicals such as lead and arsenic. Some nano-particles leach into the water, but at concentrations that pose no threat to health. Pradeep describes the process of making the filter as "water positive": 1 litre of water spent on making nanoparticles gives 500 litres of clean water.

It has been an absolute pleasure to present you articles that you wish to read. We look forward to many more new-technology research articles from you and your friends. We are eagerly awaiting the rich and thorough research papers that our authors have prepared for the next issue.

Thanks,
Editorial Team IJITCE


Editorial Members

Dr. Chee Kyun Ng, Ph.D., Department of Computer and Communication Systems, Faculty of Engineering, Universiti Putra Malaysia, UPM Serdang, 43400 Selangor, Malaysia.
Dr. Simon SEE, Ph.D., Chief Technologist and Technical Director at Oracle Corporation; Associate Professor (Adjunct) at Nanyang Technological University; Professor (Adjunct) at Shanghai Jiaotong University, 27 West Coast Rise #08-12, Singapore 127470.
Dr. sc.agr. Horst Juergen SCHWARTZ, Ph.D., Humboldt-University of Berlin, Faculty of Agriculture and Horticulture, Asternplatz 2a, D-12203 Berlin, Germany.
Dr. Marco L. Bianchini, Ph.D., Italian National Research Council; IBAF-CNR, Via Salaria km 29.300, 00015 Monterotondo Scalo (RM), Italy.
Dr. Nijad Kabbara, Ph.D., Marine Research Centre / Remote Sensing Centre / National Council for Scientific Research, P.O. Box 189, Jounieh, Lebanon.
Dr. Aaron Solomon, Ph.D., Department of Computer Science, National Chi Nan University, No. 303, University Road, Puli Town, Nantou County 54561, Taiwan.
Dr. Arthanariee. A. M, M.Sc., M.Phil., M.S., Ph.D., Director, Bharathidasan School of Computer Applications, Ellispettai, Erode, Tamil Nadu, India.
Dr. Takaharu KAMEOKA, Ph.D., Professor, Laboratory of Food, Environmental & Cultural Informatics, Division of Sustainable Resource Sciences, Graduate School of Bioresources, Mie University, 1577 Kurimamachiya-cho, Tsu, Mie, 514-8507, Japan.
Mr. M. Sivakumar, M.C.A., ITIL., PRINCE2., ISTQB., OCP., ICP, Project Manager - Software, Applied Materials, 1a Park Lane, Cranford, UK.
Dr. Bulent Acma, Ph.D., Anadolu University, Department of Economics, Unit of Southeastern Anatolia Project (GAP), 26470 Eskisehir, Turkey.
Dr. Selvanathan Arumugam, Ph.D., Research Scientist, Department of Chemistry, University of Georgia, GA-30602, USA.

Review Board Members

Dr. Paul Koltun, Senior Research Scientist, LCA and Industrial Ecology Group, Metallic & Ceramic Materials, CSIRO Process Science & Engineering, Private Bag 33, Clayton South MDC 3169, Gate 5 Normanby Rd., Clayton Vic. 3168, Australia.
Dr. Zhiming Yang, MD., Ph.D., Department of Radiation Oncology and Molecular Radiation Science, 1550 Orleans Street Rm 441, Baltimore MD, 21231, USA.
Dr. Jifeng Wang, Department of Mechanical Science and Engineering, University of Illinois at Urbana-Champaign, Urbana, Illinois, 61801, USA.
Dr. Giuseppe Baldacchini, ENEA - Frascati Research Center, Via Enrico Fermi 45 - P.O. Box 65, 00044 Frascati, Roma, Italy.
Dr. Mutamed Turki Nayef Khatib, Assistant Professor of Telecommunication Engineering, Head of Telecommunication Engineering Department, Palestine Technical University (Kadoorie), Tul Karm, Palestine.
Dr. P. Uma Maheswari, Prof. & Head, Department of CSE/IT, INFO Institute of Engineering, Coimbatore.



Dr. T. Christopher, Ph.D., Assistant Professor & Head, Department of Computer Science, Government Arts College (Autonomous), Udumalpet, India.
Dr. T. DEVI, Ph.D. Engg. (Warwick, UK), Head, Department of Computer Applications, Bharathiar University, Coimbatore-641 046, India.
Dr. Renato J. Orsato, Professor at FGV-EAESP, Getulio Vargas Foundation, São Paulo Business School, Rua Itapeva, 474 (8° andar), 01332-000, São Paulo (SP), Brazil; Visiting Scholar at INSEAD, INSEAD Social Innovation Centre, Boulevard de Constance, 77305 Fontainebleau, France.
Y. Benal Yurtlu, Assist. Prof., Ondokuz Mayis University.
Dr. Sumeer Gul, Assistant Professor, Department of Library and Information Science, University of Kashmir, India.
Dr. Chutima Boonthum-Denecke, Ph.D., Department of Computer Science, Science & Technology Bldg., Rm 120, Hampton University, Hampton, VA 23688.
Dr. Lucy M. Brown, Ph.D., Texas State University, 601 University Drive, School of Journalism and Mass Communication, OM330B, San Marcos, TX 78666.
Javad Robati, Crop Production Department, University of Maragheh, Golshahr, Maragheh, Iran.
Vinesh Sukumar (PhD, MBA), Product Engineering Segment Manager, Imaging Products, Aptina Imaging Inc.
Dr. Binod Kumar, PhD (CS), M.Phil. (CS), MIAENG, MIEEE, HOD & Associate Professor, IT Dept., Medi-Caps Inst. of Science & Tech. (MIST), Indore, India.
Dr. S. B. Warkad, Associate Professor, Department of Electrical Engineering, Priyadarshini College of Engineering, Nagpur, India.
Dr. doc. Ing. Rostislav Chotěborský, Ph.D., Katedra materiálu a strojírenské technologie, Technická fakulta, Česká zemědělská univerzita v Praze, Kamýcká 129, Praha 6, 165 21.
Mr. Abhishek Taneja, B.Sc. (Electronics), M.B.E., M.C.A., M.Phil., Assistant Professor, Department of Computer Science & Applications, Dronacharya Institute of Management and Technology, Kurukshetra, India.

Dr. Amala VijayaSelvi Rajan, B.sc,Ph.d, Faculty – Information Technology Dubai Women’s College – Higher Colleges of Technology,P.O. Box – 16062, Dubai, UAE

Naik Nitin Ashokrao B.sc,M.Sc Lecturer in Yeshwant Mahavidyalaya Nanded University



Dr. A. Kathirvel, B.E., M.E., Ph.D., MISTE, MIACSIT, MENGG, Professor, Department of Computer Science and Engineering, Tagore Engineering College, Chennai.
Dr. H. S. Fadewar, B.Sc., M.Sc., M.Phil., Ph.D., PGDBM, B.Ed., Associate Professor, Sinhgad Institute of Management & Computer Application, Mumbai-Bangalore Westerly Express Way, Narhe, Pune - 41.
Dr. David Batten, Leader, Algal Pre-Feasibility Study, Transport Technologies and Sustainable Fuels, CSIRO Energy Transformed Flagship, Private Bag 1, Aspendale, Vic. 3195, Australia.
Dr. R. C. Panda, M.Tech. & Ph.D. (IITM); Ex-Faculty (Curtin Univ. Tech., Perth, Australia); Scientist, CLRI (CSIR), Adyar, Chennai - 600 020, India.
Miss Jing He, Ph.D. Candidate, Georgia State University, 1450 Willow Lake Dr. NE, Atlanta, GA, 30329.
Jeremiah Neubert, Assistant Professor, Mechanical Engineering, University of North Dakota.
Hui Shen, Mechanical Engineering Dept., Ohio Northern Univ.
Dr. Xiangfa Wu, Ph.D., Assistant Professor, Mechanical Engineering, North Dakota State University.
Seraphin Chally Abou, Professor, Mechanical & Industrial Engineering Department, MEHS Program, 235 Voss-Kovach Hall, 1305 Ordean Court, Duluth, Minnesota 55812-3042.
Dr. Qiang Cheng, Ph.D., Assistant Professor, Computer Science Department, Southern Illinois University Carbondale, Faner Hall, Room 2140 - Mail Code 4511, 1000 Faner Drive, Carbondale, IL 62901.
Dr. Carlos Barrios, PhD, Assistant Professor of Architecture, School of Architecture and Planning, The Catholic University of America.
Dr. Wael M. G. Ibrahim, Department Head, Electronics Engineering Technology Dept., School of Engineering Technology, ECPI College of Technology, 5501 Greenwich Road - Suite 100, Virginia Beach, VA 23462.

Dr. Messaoud Jake Bahoura Associate Professor-Engineering Department and Center for Materials Research Norfolk State University,700 Park avenue,Norfolk, VA 23504



Dr. V. P. Eswaramurthy, M.C.A., M.Phil., Ph.D., Assistant Professor of Computer Science, Government Arts College (Autonomous), Salem-636 007, India.
Dr. P. Kamakkannan, M.C.A., Ph.D., Assistant Professor of Computer Science, Government Arts College (Autonomous), Salem-636 007, India.
Dr. V. Karthikeyani, Ph.D., Assistant Professor of Computer Science, Government Arts College (Autonomous), Salem-636 008, India.
Dr. K. Thangadurai, Ph.D., Assistant Professor, Department of Computer Science, Government Arts College (Autonomous), Karur - 639 005, India.
Dr. N. Maheswari, Ph.D., Assistant Professor, Department of MCA, Faculty of Engineering and Technology, SRM University, Kattangulathur, Kanchipuram Dt - 603 203, India.
Mr. Md. Musfique Anwar, B.Sc. (Engg.), Lecturer, Computer Science & Engineering Department, Jahangirnagar University, Savar, Dhaka, Bangladesh.
Mrs. Smitha Ramachandran, M.Sc. (CS), SAP Analyst, AkzoNobel, Slough, United Kingdom.
Dr. V. Vallimayil, Ph.D., Director, Department of MCA, Vivekanandha Business School For Women, Elayampalayam, Tiruchengode - 637 205, India.
Mr. M. Moorthi, M.C.A., M.Phil., Assistant Professor, Department of Computer Applications, Kongu Arts and Science College, India.
Prema Selvaraj, B.Sc., M.C.A., M.Phil., Assistant Professor, Department of Computer Science, KSR College of Arts and Science, Tiruchengode.
Mr. G. Rajendran, M.C.A., M.Phil., N.E.T., PGDBM., PGDBF., Assistant Professor, Department of Computer Science, Government Arts College, Salem, India.
Dr. Pradeep H Pendse, B.E., M.M.S., Ph.D., Dean - IT, Welingkar Institute of Management Development and Research, Mumbai, India.
Muhammad Javed, Centre for Next Generation Localisation, School of Computing, Dublin City University, Dublin 9, Ireland.
Dr. G. GOBI, Assistant Professor, Department of Physics, Government Arts College, Salem - 636 007.
Dr. S. Senthilkumar, Post Doctoral Research Fellow (Mathematics and Computer Science & Applications), Universiti Sains Malaysia, School of Mathematical Sciences, Pulau Pinang-11800 [PENANG], Malaysia.
Manoj Sharma, Associate Professor, Deptt. of ECE, Prannath Parnami Institute of Management & Technology, Hissar, Haryana, India.
RAMKUMAR JAGANATHAN, Asst. Professor, Dept. of Computer Science, V.L.B Janakiammal College of Arts & Science, Coimbatore, Tamilnadu, India.
Dr. S. B. Warkad, Assoc. Professor, Priyadarshini College of Engineering, Nagpur, Maharashtra State, India.
Dr. Saurabh Pal, Associate Professor, UNS Institute of Engg. & Tech., VBS Purvanchal University, Jaunpur, India.
Manimala, Assistant Professor, Department of Applied Electronics and Instrumentation, St Joseph's College of Engineering & Technology, Choondacherry Post, Kottayam Dt., Kerala - 686579.



Dr. Qazi S. M. Zia-ul-Haque, Control Engineer, Synchrotron-light for Experimental Sciences and Applications in the Middle East (SESAME), P.O. Box 7, Allan 19252, Jordan.
Dr. A. Subramani, M.C.A., M.Phil., Ph.D., Professor, Department of Computer Applications, K.S.R. College of Engineering, Tiruchengode - 637215.
Dr. Seraphin Chally Abou, Professor, Mechanical & Industrial Engineering Department, MEHS Program, 235 Voss-Kovach Hall, 1305 Ordean Court, Duluth, Minnesota 55812-3042.
Dr. K. Kousalya, Professor, Department of CSE, Kongu Engineering College, Perundurai-638 052.
Dr. (Mrs.) R. Uma Rani, Asso. Prof., Department of Computer Science, Sri Sarada College For Women, Salem-16, Tamil Nadu, India.
MOHAMMAD YAZDANI-ASRAMI, Electrical and Computer Engineering Department, Babol "Noshirvani" University of Technology, Iran.
Dr. Kulasekharan, N, Ph.D., Technical Lead - CFD, GE Appliances and Lighting, GE India, John F Welch Technology Center, Plot # 122, EPIP, Phase 2, Whitefield Road, Bangalore - 560066, India.
Dr. Manjeet Bansal, Dean (Post Graduate), Department of Civil Engineering, Punjab Technical University, Giani Zail Singh Campus, Bathinda - 151001 (Punjab), India.
Dr. Oliver Jukić, Vice Dean for Education, Virovitica College, Matije Gupca 78, 33000 Virovitica, Croatia.
Dr. Lori A. Wolff, Ph.D., J.D., Professor of Leadership and Counselor Education, The University of Mississippi, Department of Leadership and Counselor Education, 139 Guyton, University, MS 38677.


Contents

At 900 MHz carrier frequency equalizing the Doppler effect in M-QAM at 18dB gain by Sanjeev Kumar Shah, Vinay Negi and Sandeep Singh ......................................................... [66]

GPS – IMU based Autonomous Target Tracking Algorithm for use in UAS by Jaganathan Ranganathan and William H. Semke ......................................................... [71]


At 900 MHz carrier frequency equalizing the Doppler effect in M-QAM at 18dB gain

Sanjeev Kumar Shah #1, Vinay Negi #2, Sandeep Singh #3
# Electronics and Communication, Uttaranchal Institute of Technology, Dehradun, Uttarakhand, India
1 sanjeevkshah@yahoo.co.in
2 negi.graphicera@gmail.com
3 sandeepnegi80@yahoo.com

Abstract—This paper describes the calculation and simulation results of the Doppler effect on a mobile car, with the help of the constellation diagram for 4-QAM modulation, when the mobile car experiences Rayleigh fading. An equalizer is then used to mitigate the Doppler effect: an LMS linear equalizer is applied for a mobile car having speed 30 m/sec, assumed to be on a freeway. The results are taken at three positions of the mobile car, i.e. at angles of 5°, 45° and 85°.

Keywords: 4-QAM modulation, LMS linear equalizer, Rayleigh fading, Doppler effect, constellation diagram.

I. INTRODUCTION

A. Constellation diagram

Fig.1. Constellation diagram for 16-QAM

A constellation diagram is a representation of a signal modulated by a digital modulation scheme such as quadrature amplitude modulation or phase-shift keying. It displays the signal as a two-dimensional scatter diagram in the complex plane at symbol sampling instants. In a more abstract sense, it represents the possible symbols that may be selected by a given modulation scheme as points in the complex plane. Measured constellation diagrams can be used to recognize the type of interference and distortion in a signal. By representing a transmitted symbol as a complex number and modulating a cosine and a sine carrier signal with the real and imaginary parts (respectively), the symbol can be sent with two carriers on the same frequency. They are often referred to as quadrature carriers. A coherent detector is able to independently demodulate these carriers. This principle of using two independently modulated carriers is the foundation of quadrature modulation. In pure phase modulation, the phase of the modulating symbol is the phase of the carrier itself.

B. M-Ary Quadrature Amplitude Modulation

Fig.2. Constellation diagram of 16-QAM with respective amplitude and phase

This constellation is known as a square constellation. With an even number of bits per symbol, we may write L = √M, where L is a positive integer. Under this condition, an M-ary QAM square constellation can always be viewed as the Cartesian product of a one-dimensional L-ary PAM constellation with itself.
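As a concrete check of this standard identity (a worked example, not taken from the paper's lost equation): for 16-QAM,

\[
L = \sqrt{M} = \sqrt{16} = 4, \qquad \mathcal{A}_{16\text{-QAM}} = \{\pm 1, \pm 3\} \times \{\pm 1, \pm 3\},
\]

i.e. the in-phase and quadrature amplitudes are each drawn from the same 4-PAM alphabet.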



Here L = 4, so Fig. 2 is the Cartesian product of the 4-PAM constellation with itself. A few of the other constellations offer slightly better error performance, but with a much more complicated system implementation, like the star constellation.

B. Interpretation

Upon reception of the signal, the demodulator examines the received symbol, which may have been corrupted by the channel or the receiver (e.g. additive white Gaussian noise, distortion, phase noise or interference). It selects, as its estimate of what was actually transmitted, the point on the constellation diagram which is closest (in a Euclidean distance sense) to the received symbol. Thus it will demodulate incorrectly if the corruption has caused the received symbol to move closer to another constellation point than the one transmitted. This is maximum likelihood detection. The constellation diagram allows a straightforward visualization of this process: imagine the received symbol as an arbitrary point in the I-Q plane and then decide that the transmitted symbol is whichever constellation point is closest to it. For the purpose of analyzing received signal quality, some types of corruption are very evident in the constellation diagram. For example:
1. Gaussian noise shows as fuzzy constellation points
2. Non-coherent single frequency interference shows as circular constellation points
3. Phase noise shows as rotationally spreading constellation points
4. Attenuation causes the corner points to move towards the center
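The minimum-distance rule described above is simple to express in code. The sketch below is illustrative only — it is not the authors' simulation; the constellation array and the function name are assumptions for the example:

```cpp
#include <complex>
#include <cstddef>

// Gray-mapped 4-QAM (QPSK) constellation with unit average energy.
// Illustrative alphabet; the paper does not list its exact mapping.
static const std::complex<double> kQam4[4] = {
    { 0.7071,  0.7071}, {-0.7071,  0.7071},
    {-0.7071, -0.7071}, { 0.7071, -0.7071}
};

// Maximum-likelihood detection for AWGN: pick the constellation
// point with the smallest Euclidean distance to the received sample.
std::size_t demodulate(std::complex<double> received) {
    std::size_t best = 0;
    double bestDist = std::norm(received - kQam4[0]); // |.|^2 is monotone in |.|
    for (std::size_t i = 1; i < 4; ++i) {
        double d = std::norm(received - kQam4[i]);
        if (d < bestDist) { bestDist = d; best = i; }
    }
    return best; // index of the estimated transmitted symbol
}
```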

D. Doppler effect

The Doppler Effect (or Doppler shift) is the change in frequency of a wave (or other periodic event) for an observer moving relative to its source. It is commonly heard when a vehicle sounding a siren or horn approaches, passes, and recedes from an observer. The received frequency is higher (compared to the emitted frequency) during the approach, identical at the instant of passing by, and lower during the recession.

The relative changes in frequency can be explained as follows. When the source of the waves is moving toward the observer, each successive wave crest is emitted from a position closer to the observer than the previous wave. Therefore each wave takes slightly less time to reach the observer than the previous wave, and the time between the arrivals of successive wave crests at the observer is reduced, causing an increase in the frequency. While they are travelling, the distance between successive wave fronts is reduced, so the waves "bunch together". Conversely, if the source of waves is moving away from the observer, each wave is emitted from a position farther from the observer than the previous wave, so the arrival time between successive waves is increased, reducing the frequency. The distance between successive wave fronts is increased, so the waves "spread out".

Fig.3. Doppler Effect



For waves that propagate in a medium, such as sound waves, the velocity of the observer and of the source is relative to the medium in which the waves are transmitted. The total Doppler Effect may therefore result from motion of the source, motion of the observer, or motion of the medium. Each of these effects is analyzed separately. For waves which do not require a medium, such as light or gravity in general relativity, only the relative difference in velocity between the observer and the source needs to be considered.

II. MATHEMATICAL ANALYSIS FOR SIMULATION RESULTS
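The analysis follows the standard Doppler-shift relation for a mobile receiver (see, e.g., Rappaport [5]); the worked values below are a sketch consistent with the parameters quoted in the results, not equations reproduced from the paper:

\[
f_d = \frac{v}{c}\, f_c \cos\theta = \frac{v \cos\theta}{\lambda},
\]

where v is the vehicle speed, f_c the carrier frequency, c the speed of light, and θ the angle between the direction of motion and the arriving wave. For v = 30 m/sec and f_c = 900 MHz the maximum shift is f_m = v f_c / c = 90 Hz, giving approximately 89.7 Hz, 63.6 Hz and 7.8 Hz at θ = 5°, 45° and 85°, respectively.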

III. SIMULATED RESULTS

Fig.4. Constellation diagram of 4-QAM when the mobile car does not experience any fading or Doppler effect

Table I. Mobile car having speed 30 m/sec on freeway (fd1)

Fig.5. Mobile car having speed 30 m/sec on freeway for angle 5° (4-QAM) without equalizer

Rayleigh fading is a reasonable model when there are many objects in the environment that scatter the radio signal before it arrives at the receiver. The central limit theorem holds that, if there is sufficiently much scatter, the channel impulse response will be well-modelled as a Gaussian process irrespective of the distribution of the individual components. If there is no dominant component to the scatter, then such a process will have zero mean and phase evenly distributed between 0 and 2π radians. The envelope of the channel response will therefore be Rayleigh distributed.
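A minimal way to synthesize the flat Rayleigh channel just described is to draw the in-phase and quadrature components from independent zero-mean Gaussians, which makes the envelope Rayleigh and the phase uniform on [0, 2π). This sketch illustrates that statement; it is not the authors' simulation setup:

```cpp
#include <cmath>
#include <complex>
#include <cstddef>
#include <random>
#include <vector>

// Generate n flat-fading channel gains h = (X + jY)/sqrt(2),
// X, Y ~ N(0, 1) i.i.d., so |h| is Rayleigh and E[|h|^2] = 1.
std::vector<std::complex<double>> rayleighGains(std::size_t n, unsigned seed) {
    std::mt19937 rng(seed);
    std::normal_distribution<double> gauss(0.0, 1.0);
    std::vector<std::complex<double>> h(n);
    for (auto& g : h)
        g = std::complex<double>(gauss(rng), gauss(rng)) / std::sqrt(2.0);
    return h;
}
```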


Fig.6. Mobile car having speed 30 m/sec on freeway for angle 5° (4-QAM) with equalizer

Fig.7. Mobile car having speed 30 m/sec on freeway for angle 45° (4-QAM) without equalizer

Fig.8. Mobile car having speed 30 m/sec on freeway for angle 45° (4-QAM) with equalizer

Fig.9. Mobile car having speed 30 m/sec on freeway for angle 85° (4-QAM) without equalizer

Fig.10. Mobile car having speed 30 m/sec on freeway for angle 85° (4-QAM) with equalizer



IV. CONCLUSION

This paper shows the calculation and simulation results of the Doppler effect on a mobile car, with the help of the constellation diagram for 4-QAM modulation, when the mobile car experiences Rayleigh fading, and an LMS linear equalizer is used to mitigate the Doppler effect for a mobile car having speed 30 m/sec, assumed to be on a freeway. The results show the distorted constellation points caused by the Doppler effect when the gain is taken as 18 dB and the carrier frequency is 900 MHz (i.e. the U.S. digital cellular system) for each observation. They also show that the LMS linear equalizer corrects those distorted constellation points, mitigating the Doppler effect, at each of the angles 5°, 45° and 85°.
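For reference, an LMS linear equalizer adapts its taps by the standard least-mean-square recursion; the sketch below illustrates that recursion in its usual complex form (the function, its signature, and the step size are illustrative, not the authors' settings):

```cpp
#include <complex>
#include <cstddef>
#include <vector>

// One complex-LMS iteration: y = w^H x, e = d - y, w <- w + mu * x * conj(e).
// d is the training (or decision-directed) symbol, x the input tap vector.
std::complex<double> lmsStep(std::vector<std::complex<double>>& w,
                             const std::vector<std::complex<double>>& x,
                             std::complex<double> d, double mu) {
    std::complex<double> y(0.0, 0.0);
    for (std::size_t i = 0; i < w.size(); ++i)
        y += std::conj(w[i]) * x[i];            // equalizer output
    const std::complex<double> e = d - y;       // error signal
    for (std::size_t i = 0; i < w.size(); ++i)
        w[i] += mu * x[i] * std::conj(e);       // stochastic-gradient tap update
    return y;
}
```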

REFERENCES

[1] BER Performance of Reed-Solomon Code Using M-ary FSK Modulation in AWGN Channel, International Journal of Advances in Science and Technology, Vol. 3, No. 1, 2011.
[2] Difference Threshold Test for M-FSK Signaling With Reed-Solomon Coding and Diversity Combining in Rayleigh Fading Channels, IEEE Transactions on Vehicular Technology, Vol. 54, No. 3, May 2005.
[3] Xiaodong Cai and Georgios B. Giannakis, Performance Analysis of Combined Transmit Selection Diversity and Receive Generalized Selection Combining in Rayleigh Fading Channels, IEEE Transactions on Wireless Communications, Vol. 3, No. 6, November 2004.
[4] Pooi Yuen Kam, Bit-Error Probabilities of 2 and 4DPSK with Nonselective Rayleigh Fading, Diversity Reception, and Correlated Gaussian Interference, IEEE Transactions on Communications, Vol. 45, No. 4, April 1997.
[5] T. S. Rappaport, Wireless Communication. Prentice-Hall, Upper Saddle River, N.J., 1996.


GPS – IMU based Autonomous Target Tracking Algorithm for use in UAS

Jaganathan Ranganathan #1 and William H. Semke #2

#1 University of North Dakota, Department of Earth System Science & Policy, Clifford Hall Room 333, 4149 University Ave Stop 9011, Grand Forks, ND 58202, USA. Jaganathan.ranganathan@gmail.com
#2 University of North Dakota, Department of Mechanical Engineering, Upson II Room 271, 243 Centennial Drive Stop 8359, Grand Forks, ND 58202, USA. william.semke@engr.und.edu

Abstract—Tracking a ground based target autonomously from an Unmanned Aircraft System (UAS) is a critical task for remote sensing or any Intelligence, Surveillance, and Reconnaissance (ISR) mission that requires precision pointing and effective real-time data transmission/recording to the ground station. To achieve this, the authors introduce a novel non-linear closed form analytical algorithm, derived from coordinate transformation and vector algebra principles, that was implemented onboard a small UAS. The Unmanned Aircraft Systems Engineering Laboratory (UASE) at the University of North Dakota (UND), Grand Forks, ND developed a small UAV payload, "SUNDOG," to demonstrate the autonomous tracking ability of the derived algorithm and its implementation. The major advantage of the algorithm is that it allows the user to specify and maintain the camera/target orientation irrespective of the aircraft position and rotation, which helps ground personnel easily interpret the data while tracking the target location in real time. The equations provide an elegant closed form solution to a non-linear problem that can be easily and efficiently programmed. The algorithm was verified through several experiments and demonstrated successfully in a UAS test flight. Actual flight data illustrating the effectiveness of the surveillance algorithm is presented.

Keywords: Autonomous Tracking Algorithm, Precision Pointing, Multi-Axis Gimbal Tracking, Mini-UAV Payload.

I. INTRODUCTION

The objective of developing an autonomous tracking system is to allow small fixed wing UAS platforms to both station-keep on targets of interest and estimate accurate position information for those targets using onboard assets. Specific targets of interest vary from a stationary point on the ground (applications such as fire-fighting, surveillance, and atmospheric research) to cooperative and non-cooperative aircraft in the national airspace system for airborne sense and avoid ([1]-[2]). This concept is illustrated in Fig. 1. An autonomous tracking algorithm for a three-axis gimbal system using Global Positioning System (GPS) and Inertial Measurement Unit (IMU) information is derived using coordinate transformation and vector algebra principles. The final outputs of the algorithm are the three rotational angles required to point/track at a known stationary/moving target with a prescribed orientation. Irrespective of the aircraft position and its rotation, the derived algorithm tracks any known stationary target on the ground while maintaining the user defined orientation. A custom developed C++ program on a Linux platform was implemented in the payload to provide a communication link with all the sensors, such as the GPS, IMU, motion controllers, autopilot, etc.; it sends, retrieves, and processes information to track the target. It also establishes a wireless communication link from the ground station to the aircraft for effective ground control operation. Several experiments have been carried out to verify the tracking ability and the results are presented.



[Fig. 1 diagram labels: "Flight Path Defined by GPS Waypoints"; "Precision Pointing at a Single Location or Accurate Target Tracking"]

Fig. 1. Small UAS Autonomous Tracking System Concept

A small UAS imaging payload has been designed and constructed by the Unmanned Aircraft Systems Engineering (UASE) team at the University of North Dakota (UND) that allows for full 3-axis rotation of electro-optical (EO) and uncooled thermal infrared (IR) cameras ([3]-[5]). The first successful flight of this payload using the autonomous target tracking algorithm took place on October 18, 2008, in military restricted airspace over Camp Grafton Training Center, an Army National Guard Maneuver Training center near Devil's Lake, North Dakota. The payload was flown using the UND-Super Hauler UAV. Fig. 2 shows the electro-optical images captured using the autonomous target tracking algorithm during the actual flight.

Fig. 2. Electro-Optical images captured while using the autonomous target tracking algorithm

Fig. 3 shows a CAD model and physical implementation of the three-axis gimbal system design, while Fig. 4 is a CAD model and physical implementation of an autonomous tracking payload known as the SUNDOG – Surveillance by University of North Dakota Observational Gimbal.

Fig. 3. Three-Axis Gimbal System CAD model (Left) and Physical implementation (Right)

Fig. 4. CAD model (Left) and Physical Implementation (Right) of an Autonomous Tracking Payload "SUNDOG"

The SUNDOG payload has the capability to track any known stationary target location autonomously and also to manually point the camera towards the target location using a joystick. For precision pointing, a non-linear closed form analytical algorithm was developed to determine the exact rotation angles for a three-axis gimbal system to point a digital imaging sensor at any known target with a prescribed orientation. For autonomous tracking, the calculated rotation angles must be provided to the gimbal system so that it can accurately locate the defined target and track it accordingly as the aircraft moves. The algorithm derived in this paper was successfully implemented and tested in the SUNDOG payload. Since the algorithm uses simple algebraic closed form expressions to calculate the pointing angles of the gimbal, it requires less computational time to track the target when compared to many other algorithms, making it more easily implemented in a small UAS with limited resources ([6]-[10]). The algorithm also maintains the camera orientation to north (user-defined) in the inertial coordinate system. This allows ground personnel to compare the real time video directly to maps without any orientation misalignment, making the data much more easily understandable. The limitations of the tracking algorithms mentioned ([6]-[10]), compared to the algorithm presented



in this paper, are: they do not maintain the camera orientation in the specified direction, there are discontinuities in the gimbal rotation, they are expensive, and they are position and time dependent.

The SUNDOG payload has the capability for a ground-based payload operator to manually control the pointing of the cameras with a joystick, based upon real time video data received from the EO camera. For this, an algorithm was developed to determine the target location based on the position and attitude information of the aircraft and the gimbal pointing angles. This paper focuses only on stationary targets, but the algorithm has the ability to be extended to ground and airborne moving targets.

II. ACCURATE GIMBAL POINTING ALGORITHMS

Kinematics Analysis of the System

A kinematic analysis is done on a three-axis gimbal system to get the appropriate model of gimbal rotations in order to point at a certain location on the ground from an aerial platform. The mathematical model includes an inertial system that has coordinates fixed to the Earth, a coordinate system that is body-fixed to the aircraft, and a third coordinate system that is fixed to the gimbal. The end results of the analysis are the rotation angles, about each gimbal axis, that will result in the gimbal pointing at the correct spot with a desired orientation. Correct orientation will allow an image to have the same orientation to the ground coordinates independent of the direction of flight. The scenario investigated is when the inertial coordinates of the target, the aircraft location and the orientation angles required for accurate pointing are known. As stated previously, this system includes three separate coordinate systems. These coordinate systems and their orientations are shown in Fig. 5. The orientation for the coordinate systems is arbitrary, but they do need to be defined. In this system the inertial coordinates were defined so that the x-axis is in the North direction, the y-axis is in the East direction and the z-axis is downward. The aircraft coordinates were defined with the x-axis being the same as the heading vector. The gimbal system is defined with the x-axis being the tilt axis of the gimbal, initially aligned with the x-axis of the aircraft system. The y-axis of the gimbal is the pan axis of the gimbal. The z-axis of the camera is also defined as the line-of-sight axis, which is required to help solve for the pointing and orientation parameters.

Fig. 5. Orientation of Inertial Coordinate (Top), Aircraft Coordinate (Middle), Gimbal & Camera Coordinate (Bottom)

The topic of coordinate transforms and the kinematics involved with flight have been well covered in the literature ([11]-[20]). Often it is advantageous to define a system using multiple coordinate systems. The problem with



INTERNATIONAL JOURNAL OF INNOVATIVE TECHNOLOGY AND CREATIVE ENGINEERING (ISSN:2045-8711) VOL.3 NO.5 MAY 2013 multiple coordinate systems is that vector operations  cγ sγ 0    cannot be done if the vectors are defined by different (2) R 1g =  − sγ cγ 0 coordinate systems without transformations. Therefore,    0  0 1  the components of a vector in one coordinate can be transformed into a vector in a different coordinate system. The first transform that is needed for this analysis is a transform between the inertial system and the aircraft fixed system. For this transform, a 3-2-1 (NASA Standard Aircraft) rotation is used [13]. In this situation, ψ is the heading angle, or yaw, and corresponds to a rotation about the z-axis of the aircraft, θ is the pitch angle and is a rotation about the y-axis of the aircraft, and φ is the roll angle and is a rotation about the x-axis of the aircraft. These rotations are shown in Fig. 6. The order of rotation for this transform is ψ, θ, and then φ. The rotation matrix for this type of transform is well known and is shown in Eq. 1 as Ra.

R 2g =

R 3g =

sψ cθ cψ cφ + sψ sθ sφ cψ sφ + sψ sθ cφ

sθ  cθ sφ  cθ cφ 

1  0  0 

0 − sβ   1 0  0 0 cα − sα

[ ]

(3)

  

0  sα 

(4)

 cα 

R   g 

= R 1g R 2g  R 3g

R   g 

 cγ cβ = − sγ cβ  sβ

 

sγ cα + cγ sβ sα cγ cα − sγ sβ sα − cβ sα

(5) sγ sα − cγ sβ cα  cγ sα + sγ sβ cα   cβ cα

(6)

The order of rotation used for the Ra matrix is arbitrary, but using a different order will result in a different rotation matrix. The order of rotation for the Rg matrix will depend on the system setup and other systems may have different locations for the drive motors of each axis or they may move in a different order than what was used here. Whatever the order, these rotation matrices define the crucial relationship for transforming between coordinate systems.

Fig. 6.Aircraft rotations of ψ, θ, and φ about the z, y, and x axis respectively. cψ cθ   [Ra ] =  sψ cφ + cψ sθ sφ  sψ sφ + cψ sθ cφ

cβ   0   sβ 

Three-Axis Gimbal Pointing Stationary Targets

(1)

Angles

for

Known

In many remote sensing applications, a gimbal system is required to point a sensor at a given target location. To do this, the proper rotation angles (α, β, λ) must be given to the gimbal system so it can accurately locate the target. In this section an algorithm is developed to find these rotation angles. The inputs into the system are aircraft location in inertial coordinates (GPS information), aircraft attitude (roll, pitch, and yaw), target location in inertial coordinates, and the offset distance of the gimbal from the GPS receiver on the aircraft. If the GPS offset is not included or is not known, there will be errors in the angles of rotation given to the gimbal. The

The next transform needed is from the aircraft fixed coordinates to the gimbal’s camera fixed coordinates. This is done by first rotating about the z-axis by γ, then rotating about the y-axis by β, and then rotating about the x-axis by α. The rotation matrix for each of these individual rotations is shown in Eqs. 2 – 4 as R1g, R2g and R3g, respectively. To find the total rotation matrix needed, the rotation matrices from the individual rotations are multiplied together as shown in Eq. 5 with the total rotation matrix shown in Eq. 6 as Rg.
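Eq. 1 translates directly into code. The following is a minimal sketch of that transcription under the reconstructed signs above (the type alias and function name are mine, not from the flight software):

```cpp
#include <array>
#include <cmath>

using Mat3 = std::array<std::array<double, 3>, 3>;

// Eq. 1: inertial (NED) to aircraft body axes, a 3-2-1 rotation
// applied in the order yaw (psi), pitch (theta), roll (phi).
Mat3 makeRa(double psi, double theta, double phi) {
    const double cps = std::cos(psi),   sps = std::sin(psi);
    const double cth = std::cos(theta), sth = std::sin(theta);
    const double cph = std::cos(phi),   sph = std::sin(phi);
    return {{
        {{ cps*cth,                sps*cth,               -sth     }},
        {{-sps*cph + cps*sth*sph,  cps*cph + sps*sth*sph,  cth*sph }},
        {{ sps*sph + cps*sth*cph, -cps*sph + sps*sth*cph,  cth*cph }}
    }};
}
```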



The analysis for finding the rotation angles is done as if looking through the camera, maintaining the z-axis of the gimbal system pointing along the vector going from the gimbal location to the target location.

The first step in finding the rotation angles is to define the known locations in the form of vectors. In Eqs. 7-9, the P vector is the position of the aircraft in inertial coordinates, the T vector is the position of the target in inertial coordinates and the Go vector is the offset of the gimbal in aircraft fixed coordinates.

\[
\{P\} = \begin{bmatrix} x_p \\ y_p \\ z_p \end{bmatrix}, \qquad
\{T\} = \begin{bmatrix} x_t \\ y_t \\ z_t \end{bmatrix}, \qquad
\{G_o\} = \begin{bmatrix} x_o \\ y_o \\ z_o \end{bmatrix} \tag{7-9}
\]

The location of the gimbal needs to be expressed in inertial coordinates to define the line of sight vector. To do this, the inertial coordinates of the aircraft must first be transformed into the body-fixed coordinate system of the aircraft. This is done by multiplying the inertial coordinates by the rotation matrix, as shown in Eq. 10.

\[ \{P_a\} = [R_a]\{P\} \tag{10} \]

Once this is done, the gimbal offset, which is already in the body-fixed coordinates, can be subtracted from the aircraft coordinates to give the actual location of the gimbal in body-fixed coordinates (Eq. 11).

\[ \{G_a\} = \{P_a\} - \{G_o\} \tag{11} \]

The next step is to specify the required orientation of the camera image, which can be North, South, East or West. For example, to maintain the top of the camera image in the North direction, the x unit vector in the gimbal system should always be aligned to the North direction, or in this case the inertial x-axis. Hence, to maintain the camera orientation in the northern direction, the z-axis rotation of the gimbal system (λ) should be equal and opposite to the z-axis rotation of the aircraft (ψ). Here, the gimbal location in aircraft coordinates needs to be transformed to the gimbal coordinates. This is done by multiplying the aircraft coordinates by the rotation matrix R, as shown in Eq. 12.

\[
\{G_{ag}\} = [R]\{G_a\}, \quad \text{where } [R] = \begin{bmatrix} c\lambda & s\lambda & 0 \\ -s\lambda & c\lambda & 0 \\ 0 & 0 & 1 \end{bmatrix} \text{ and } \lambda = -\psi \tag{12}
\]

Now the gimbal location needs to be transformed back into the inertial coordinates so that the vector between the gimbal and the target can be defined. This is done by multiplying the gimbal location by the inverse of the system's rotation matrix, as shown in Eq. 13.

\[ \{G\} = ([R][R_a])^{-1}\{G_{ag}\} \tag{13} \]

The line of sight vector is then defined as the difference between the target location and the gimbal location, both in inertial coordinates, as shown in Eq. 14.

\[ \{D\} = \{T\} - \{G\} = \begin{bmatrix} \Delta x \\ \Delta y \\ \Delta z \end{bmatrix} \tag{14} \]

The line of sight vector then needs to be transformed into the body-fixed coordinates of the aircraft. This is done by using the aircraft rotation matrix, as shown in Eq. 15.

\[ \{D_a\} = [R][R_a]\{D\} = \begin{bmatrix} \Delta x_a \\ \Delta y_a \\ \Delta z_a \end{bmatrix} \tag{15} \]

The final rotation is to transform the pointing vector into the gimbal fixed coordinate system. The angles needed for this rotation are the angles that will be the pointing angles for the gimbal. They are therefore unknown, but, by using specific constraints, the equations from the rotation will provide the pointing angles. The transformation is shown in Eq. 16.

\[
\{D_g\} = [R_g]\{D_a\} =
\begin{bmatrix}
(c\gamma\, c\beta)\Delta x_a + (s\gamma\, c\alpha + c\gamma\, s\beta\, s\alpha)\Delta y_a + (s\gamma\, s\alpha - c\gamma\, s\beta\, c\alpha)\Delta z_a \\
(-s\gamma\, c\beta)\Delta x_a + (c\gamma\, c\alpha - s\gamma\, s\beta\, s\alpha)\Delta y_a + (c\gamma\, s\alpha + s\gamma\, s\beta\, c\alpha)\Delta z_a \\
(s\beta)\Delta x_a + (-c\beta\, s\alpha)\Delta y_a + (c\beta\, c\alpha)\Delta z_a
\end{bmatrix}
= \begin{bmatrix} \Delta x_g \\ \Delta y_g \\ \Delta z_g \end{bmatrix} \tag{16}
\]
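Chaining Eqs. 13-16 gives a single closed-form pipeline; this consolidated line is implied by the equations above rather than printed in the paper:

\[
\{D_g\} = [R_g][R][R_a]\left(\{T\} - \{G\}\right),
\]

so the only remaining nonlinear step is extracting the angles α, β and γ from the constraint that {D_g} lies along the camera's z-axis.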



In this system, the z-axis of the camera falls along the line of sight vector, and this fact is used to solve the equations for the unknown variables α, β, and γ. Therefore, when the line of sight vector is expressed in terms of the gimbal fixed coordinate system, it will not have any components in the x or y direction of the gimbal system. This provides the following two equations (Eqs. 17-18):

\[ \Delta x_g = 0; \qquad \Delta y_g = 0 \tag{17-18} \]

Therefore, by applying Eqs. 17-18 in Eq. 16, it reduces to the following equations, as expressed in Eqs. 19-20:

\[ (c\gamma\, c\beta)\Delta x_a + (s\gamma\, c\alpha + c\gamma\, s\beta\, s\alpha)\Delta y_a + (s\gamma\, s\alpha - c\gamma\, s\beta\, c\alpha)\Delta z_a = 0 \tag{19} \]

\[ (s\gamma\, c\beta)\Delta x_a + (c\gamma\, c\alpha - s\gamma\, s\beta\, s\alpha)\Delta y_a + (c\gamma\, s\alpha + s\gamma\, s\beta\, c\alpha)\Delta z_a = 0 \tag{20} \]

This results in two equations with three unknown variables. To solve Eqs. 19-20, the three unknown variables are reduced to two by finding any one of the variables α, β, or γ. For this system, the variable γ is found by defining a constraint. This constraint comes from the fact that the x-axis of both the gimbal/camera fixed coordinates and the inertial coordinates have been chosen to be aligned (parallel) to each other. Since the x-axis of the gimbal coordinates is already aligned to the x-axis (North direction) of the inertial coordinates, which is obtained by Eq. 12, the rotation about the z-axis of the camera, γ, is equal to zero in the gimbal coordinate system. Applying this constraint in Eqs. 19-20, Eqs. 21-22 are found:

\[ (c\beta)\Delta x_a + (s\beta\, s\alpha)\Delta y_a - (s\beta\, c\alpha)\Delta z_a = 0 \tag{21} \]

\[ (c\alpha)\Delta y_a + (s\alpha)\Delta z_a = 0 \tag{22} \]

Solving Eq. 22 with respect to α gives Eq. 23:

\[ \alpha = -\tan^{-1}\!\left(\frac{\Delta y_a}{\Delta z_a}\right) \tag{23} \]

Similarly, solving Eq. 21 with respect to β gives Eq. 24:

\[ \beta = -\tan^{-1}\!\left(\frac{\Delta x_a}{s\alpha\, \Delta y_a - c\alpha\, \Delta z_a}\right) \tag{24} \]

To maintain the camera orientation, the rotation about the z-axis of the gimbal system (λ) should be equal and opposite to ψ, as shown in Eq. 25:

\[ \lambda = -\psi \tag{25} \]

With all three of these rotation angles defined (α, β, λ), a controller can give precise commands to the gimbal system to accurately locate a target and orient the camera's image so that the top is always in the northern direction. The algorithm derived here is a novel, non-linear closed form analytical expression (Eqs. 23-25) which uses position and attitude information of the aircraft to calculate the exact rotation angles for a three-axis gimbal system to point a digital imaging sensor at a target. Since the algorithm uses simple algebraic expressions to determine the gimbal pointing angles, pointing the imaging sensors towards the target is computationally efficient. The authors are not aware of any other published algorithms that result in closed form solutions with heading corrections for use onboard a UAS.
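Eqs. 23-25 reduce the whole pointing problem to two arctangents and a sign flip. A minimal sketch of that final step follows; the struct and function names are mine, and atan2 replaces the printed tan⁻¹ for quadrant robustness:

```cpp
#include <cmath>

struct GimbalAngles { double alpha, beta, lambda; }; // radians

// Eqs. 23-25: pointing angles from the line-of-sight vector
// (dxa, dya, dza) expressed in aircraft body axes, and yaw psi.
GimbalAngles pointingAngles(double dxa, double dya, double dza, double psi) {
    GimbalAngles g;
    g.alpha  = -std::atan2(dya, dza);                            // Eq. 23
    g.beta   = -std::atan2(dxa,
                  std::sin(g.alpha)*dya - std::cos(g.alpha)*dza); // Eq. 24
    g.lambda = -psi;                                             // Eq. 25
    return g;
}
```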

III. IMPLEMENTATION

A. SUNDOG

The Surveillance by University of North Dakota Observational Gimbal (SUNDOG) payload includes a three-axis precision pointing system for an EO camera and an uncooled thermal IR camera. This payload was developed to capture both the infrared and electro-optical video image of the ground, which is useful in agricultural and surveillance applications. Initially, a sensor operator on the ground was able to aim the two cameras in flight via joystick control, useful for



applications in surveillance and target tracking. To eliminate the use of the joystick and fully automate the target tracking, the novel non-linear closed form algorithm presented was developed. This algorithm uses position and attitude information of the aircraft to calculate the pointing angles of the gimbal system. Most of the hardware and software utilized in the payload design is commercial-off-the-shelf or share-ware in nature. Fig. 7 provides a diagram of the commercial-off-the-shelf (COTS) hardware utilized in the payload and ground control station design.

[Fig. 7 diagram: payload — FLIR uncooled IR and Sony EO cameras, 3DM-GX1 IMU, micromotors & controllers (x3), camera pointing system, PC/104+ SBC, Crista IMU, Garmin GPS, navigation system, EIDE 2.5" HDD, dual-channel frame grabber (NTSC), serial ports (RS-232), WLAN 802.11a/b/g; ground station — laptop w/ FlightGear, joystick controller, WLAN 802.11a/b/g]

Fig. 7. COTS hardware and software diagram for the SUNDOG payload.

B. Software

The camera pointing system demands a complex coordination of software to accurately position the cameras based on ground station commands. The ground station runs Windows XP and includes applications such as Flight Gear, Putty, and VLC. Flight Gear is an open-source flight simulation application that allows wireless commands to be sent to the payload. The payload utilized a Linux operating system called OpenSuse 10.2 and includes OpenSSH and VLC applications along with a custom-written C++ program. Putty and OpenSSH permit secure system commands to be sent from the ground station to the payload. VLC wirelessly streams the video from the two cameras on the payload, which can then be received by the ground station and shown for near real-time feedback.

The custom C++ program performs several tasks, such as receiving the ground station's motor control commands and sending IMU and GPS data to the ground station. Wireless control of the payload motors from the ground station was achieved by using the PC/104+ stack. The motor control commands originate from a three-axis joystick connected to the ground station via a high-speed USB port. Flight Gear recognizes the joystick input and changes values within the program's property tree accordingly. Flight Gear's property tree contains a list of flight control variables that can be used to manipulate a remote aircraft or other system. There are other variables within the property tree that, when changed by incoming data, communicate to Flight Gear the navigation, orientation, and system status of the payload. In the case of motor control, the three-axis joystick inputs are interpreted and assigned to three of the Flight Gear flight control properties. While Flight Gear is running, it can constantly send these control property values to a designated IP address using the user datagram protocol (UDP).

On the payload side, a custom C++ program was developed to set up a host socket to receive commands from the ground station IP address. The payload receives the UDP packets, which contain the control properties from Flight Gear. The program finds the desired properties, performs calculations on the property values, and sends the corresponding velocity commands to the three motion controllers through an RS-232 serial port. These parameters are translated into motor motion at the desired velocity in revolutions per minute. The C++ code compiled on the payload PC/104+ SBC utilizes POSIX threads, or pthreads. These pthreads allow the C++ code to continuously execute several operations, including scanning for incoming UDP packets, making calculations on the Flight Gear properties, and sending velocity commands to the motion controllers.

The last addition to the custom C++ program allows for position feedback from the motors. To achieve this, a method was added to the coding sequence which sends a position query to each controller in the same manner that velocity commands are sent. The controller retrieves the position value from the incremental encoder and relays this value back to the PC/104+ SBC through the serial port as a string of ASCII characters. The C++ program parses the incoming serial data and outputs the motor position. This ability to retrieve motor position is important in automated object tracking.
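The receive path described above — UDP packets in, serial velocity commands out — can be sketched in a few lines of POSIX C++. This is a hedged illustration of the described architecture, not the UASE flight code; the packet layout, the serial command string, and the sendVelocity helper are invented for the example:

```cpp
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <unistd.h>
#include <cstdio>

// Hypothetical helper: forward a parsed velocity (rpm) to one motion
// controller over an already-opened RS-232 file descriptor.
void sendVelocity(int serialFd, int axis, double rpm) {
    char cmd[64];
    int n = std::snprintf(cmd, sizeof cmd, "V%d=%.1f\r", axis, rpm); // made-up protocol
    write(serialFd, cmd, n);
}

void receiveLoop(int serialFd, unsigned short port) {
    int sock = socket(AF_INET, SOCK_DGRAM, 0);
    sockaddr_in addr{};
    addr.sin_family = AF_INET;
    addr.sin_addr.s_addr = htonl(INADDR_ANY);
    addr.sin_port = htons(port);
    bind(sock, reinterpret_cast<sockaddr*>(&addr), sizeof addr);

    char buf[1500];
    for (;;) {                         // one pthread would run this loop
        ssize_t n = recv(sock, buf, sizeof buf - 1, 0);
        if (n <= 0) continue;
        buf[n] = '\0';
        double vx, vy, vz;             // assumed packet: three ASCII floats
        if (std::sscanf(buf, "%lf %lf %lf", &vx, &vy, &vz) == 3) {
            sendVelocity(serialFd, 0, vx);
            sendVelocity(serialFd, 1, vy);
            sendVelocity(serialFd, 2, vz);
        }
    }
}
```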



The final feature added to the custom C++ program was implementing the autonomous tracking algorithm, which uses the position and attitude information of the aircraft. This program receives the INS/GPS information from the Piccolo autopilot and calculates the gimbal rotation angles based on the procedure explained in Section II. After calculating the gimbal rotation in number of ticks, the program sends the appropriate commands to the motor controllers to rotate the gimbal. The payload transmits the INS/GPS and gimbal rotation values in real time to the ground station. The processing time to compute the rotational angles of the gimbal system is approximately the same as that of the input frequency of the system. For example, if the Piccolo sends the information to the program at a 20 Hz rate, then the processing rate of the program will be approximately 20 Hz. There is not a significant time delay in processing the algorithm because it computes the information within one millisecond.

In order to find the processing time of the gimbal system to attain the desired position from its current position, an analysis was carried out using a timer function which computes the exact time taken by the gimbal system to reach the desired position. The three rotational angles (Alpha, Beta, Gamma) of the gimbal system were each given as 5 degrees and the algorithm was run to determine the time taken to rotate the gimbal system. This experiment yielded an average of 0.01 seconds to manipulate and rotate the gimbal system accordingly. Based on the analysis, the gimbal system can operate at a maximum frequency of approximately 100 Hz. Since the algorithm does not require significant computational time to process and rotate the gimbal system, it was implemented and demonstrated successfully in real time using the onboard SBC at a rate of 20 Hz.

IV. EXPERIMENTATION

In order to verify the tracking ability and the accuracy of the three-axis gimbal system, several experiments, including a laboratory experiment, a mobile ground vehicle (MGV) test, and an actual flight test, were carried out. Initially, a laboratory experiment was conducted to verify the pointing algorithm with changes in attitude information by fixing the aircraft and the target location. Then an experiment using a mobile vehicle was carried out by fixing the target location and changing the position of the payload. As the payload moves along with the vehicle, the Piccolo updates its GPS and attitude information, which are then used to calculate the pointing angles of the gimbal system. While the pointing algorithm points the camera to the target location, real time EO and IR video are simultaneously streamed and saved to determine whether the camera is pointing at the target or not. Similarly, flight tests with the payload were carried out by fixing the target location on the ground and specifying the target location to point the gimbal based on the GPS and attitude information of the aircraft.

A. Actual Flight Test

The flight test was carried out at the Camp Grafton Training Center (CGTC) with the custom developed UAV called "Super Hauler," owned and operated by the UASE, to verify the SUNDOG payload with the autonomous tracking algorithm. CGTC is a National Guard Maneuver Training Center operated by the North Dakota National Guard, located 45 miles south of Devils Lake in northeast North Dakota. Based on UASE team requirements, the UAV "Super Hauler," shown in Fig. 8, was built by Bruce Tharpe Engineering.



Fig. 8. UND's UAV "Super Hauler"

The "Super Hauler" is made of balsa wood and plywood with aluminum and carbon fiber elements. It has a wingspan of 3.7 meters, a wing area of 2.4 square meters, and a length of 3 meters. The dry weight of the vehicle is 21.8 kilograms, and it has a maximum payload capacity of 13.6 kilograms. The main payload compartment has a volume of 0.054 cubic meters; additionally, an open space of about 0.035 cubic meters in the rear part of the fuselage is used for the placement of avionics, including the autopilot. The BTE Super Hauler is powered by a Desert Aircraft (DA) 100 cc two-stroke engine, whose specifications are listed in Table I.

TABLE I
DA-100CC SPECIFICATION

Displacement        6.10 ci (100 cc)
Output              9.8 hp
Weight              5.8 lbs (2.63 kg)
Bore                1.6771 in (42.6 mm)
Stroke Length       1.3779 in (35 mm)
RPM Range           1,000 to 6,700
Max. RPM            8,500
Fuel Consumption    1.172 gallons/hour @ 6,000 RPM

Fig. 9. Aircraft’s flight path shown in Google Earth

Before the actual flight test, a preflight meeting was conducted with the internal and external pilots in command to discuss and decide the target location, the actual flight path, and the altitude required during the testing. A blue-colored tarp of dimensions 2 × 2 m was fixed on the ground, and its GPS coordinate was used as the target location in the tracking algorithm. Fig. 9 shows the aircraft flight path obtained from the flight test in Google Earth. Apart from the preplanned flight path, different flight-path patterns were flown to test the target-locked and out-of-range regions. The term "Target-Locked" refers to an aircraft position at which the camera was pointed at the target, and "Target-Out of Range" refers to an aircraft position at which the camera was not pointed at the target due to the angular limitations of the gimbal system. The NED coordinate information is shown in Fig. 10, which illustrates that the aircraft was flown at a constant altitude during the autopilot stage. Fig. 11 shows the attitude information of the aircraft, and Fig. 12 shows the rotation angles of the three-axis gimbal system in degrees. From Fig. 12, it can be noted that the gimbal rotation angles follow an identical pattern during the autopilot stage, as expected due to the repeated flight path.
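As an illustration of how a surveyed fix such as the tarp's GPS coordinate can enter the pointing algorithm, the following C++ sketch converts the aircraft and target fixes into a local NED offset using a flat-earth approximation. This approximation, and every name and sample coordinate in it, is an assumption adequate only for short ranges, not the authors' geodetic method.

#include <cmath>
#include <cstdio>

constexpr double kEarthRadiusM = 6371000.0;              // mean-Earth-radius assumption
constexpr double kDeg2Rad = 3.14159265358979 / 180.0;

struct Ned { double north, east, down; };                // meters

// Flat-earth NED offset of the target relative to the aircraft;
// adequate only over the short separations flown in these tests.
Ned targetOffsetNed(double acLat, double acLon, double acAltM,
                    double tgLat, double tgLon, double tgAltM) {
    double north = (tgLat - acLat) * kDeg2Rad * kEarthRadiusM;
    double east  = (tgLon - acLon) * kDeg2Rad * kEarthRadiusM * std::cos(acLat * kDeg2Rad);
    double down  = acAltM - tgAltM;                      // target below aircraft is positive down
    return { north, east, down };
}

int main() {
    // Hypothetical fixes: aircraft at 300 m altitude, tarp on the ground nearby.
    Ned r = targetOffsetNed(47.920, -98.680, 300.0, 47.918, -98.676, 0.0);
    std::printf("N %.1f m, E %.1f m, D %.1f m\n", r.north, r.east, r.down);
    return 0;
}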



Fig. 10. Aircraft's Position (North, East, and Down, in meters, versus time in seconds) – Flight Test

Figs. 10-12 show that as the North value decreases, the Y-axis rotation of the gimbal system (Beta) increases; a similar pattern was obtained during the MGV test. Figs. 13-15 show actual video snapshots obtained during the flight test with the EO camera. Since the orientation was specified to be in the North direction, the algorithm maintains the camera orientation in the North direction regardless of the aircraft's position and rotation. This can be seen in the images: the target remains in the center and the road maintains its orientation while the aircraft moves along its flight path. Note the changes in GPS location, IMU readings, and resulting gimbal rotations across the snapshots shown.


Fig. 11. Aircraft’s Attitude information – Flight Test




Fig. 12. Gimbal Rotation Angles (Alpha, Beta, and Gamma, in degrees, versus time in seconds) based on GPS & IMU

Fig. 13. EO video snapshot # 1 obtained from the flight test.




Fig. 14. EO video snapshot # 2 obtained from the flight test.

Fig. 15. EO video snapshot # 3 obtained from the flight test.



Based on the data obtained, an error analysis was carried out to estimate the tracking accuracy. The actual distance between the aircraft and the target location was measured in terms of NED coordinates and then compared with the distance moved by the pointing algorithm to calculate the accuracy of the tracking algorithm. Fig. 16 shows the target offset obtained in the flight test experimentation in graphical form.
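A minimal sketch of this offset computation follows, assuming the commanded ground point and the surveyed target location are both already expressed in a common local NED frame; the field and function names are illustrative.

#include <cmath>
#include <cstdio>

struct NedPoint { double north, east; };             // meters in the local NED frame

// Horizontal distance between where the pointing algorithm aimed
// and where the target actually was.
double targetOffsetM(const NedPoint& pointed, const NedPoint& truth) {
    return std::hypot(pointed.north - truth.north,
                      pointed.east  - truth.east);
}

int main() {
    NedPoint pointed{ 15.0, -16.0 };                 // hypothetical commanded ground point
    NedPoint truth  {  0.0,   0.0 };                 // tarp location taken as the NED origin
    std::printf("target offset: %.1f m\n", targetOffsetM(pointed, truth));
    return 0;
}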

Fig. 16. Target Offset – Flight Test

In Fig. 16, the green circle represents the maximum field of view of the camera (500 m) at an altitude of 300 m, and the blue circle represents the error obtained from the flight test, including the IMU error (±2 deg), GPS error (3-6 m), and gimbal manual setup error (±2 deg). The results obtained from the actual flight test are significant and promising because the target offsets are well within the maximum possible error range, the target was always in the field of view of the camera when the target was locked, and the camera orientation was always aligned in the specified direction (North). The accuracy of the algorithm obtained from the flight test is 95%, with an average target offset of 22 meters and a standard deviation of 8 meters.

Another error analysis was carried out to find the angular error caused by the target offset. The angular error is calculated from the target offset and the normalized distance between the aircraft and the target location. Based on this calculation, average errors of 3.45 degrees and -1.77 degrees were obtained in the East and North directions, respectively. The angular error analysis provides a better understanding of the data because the angular errors in the North and East directions are well within the maximum possible error of ±6 degrees, which includes the IMU error (±2 degrees), gimbal manual setup error (±2 degrees), and GPS error (3-6 meters).

Figs. 17-18 show the aircraft locations when the target was locked and out of range, respectively. In Fig. 17, the green pattern represents aircraft positions at which the target was locked, and the red pattern represents positions at which the target was out of range due to the angular restrictions of the gimbal system. The gimbal rotation angles are restricted (Alpha = ±25 deg and Beta = ±75 deg) by the physical structure and wiring connections of the gimbal. Because of the roll limitation of the gimbal system (±25 deg), target tracking was restricted to a minimum distance in the East-West direction; this distance varies with the altitude of the aircraft. This is the current condition, but by increasing the roll limit to 80 degrees in the future, the aircraft can have a broader aerial field of view to track the target, as shown in Fig. 18.
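The angular-error and target-locked checks discussed above can be sketched as follows. The arctangent relation between offset and distance, and the Alpha = ±25 deg and Beta = ±75 deg limits, are as stated in the text; the function names and sample values are illustrative.

#include <cmath>
#include <cstdio>

constexpr double kRad2Deg = 57.29577951308232;

// Angular error subtended by a ground offset at a given slant distance.
double angularErrorDeg(double offsetM, double distanceM) {
    return std::atan2(offsetM, distanceM) * kRad2Deg;
}

// A target counts as "locked" only while the commanded angles
// stay inside the gimbal's mechanical limits.
bool targetLocked(double alphaDeg, double betaDeg) {
    return std::fabs(alphaDeg) <= 25.0 && std::fabs(betaDeg) <= 75.0;
}

int main() {
    // e.g. the 22 m average offset viewed from a few hundred meters away
    std::printf("angular error: %.2f deg\n", angularErrorDeg(22.0, 365.0));
    std::printf("alpha=30, beta=40 -> locked? %s\n",
                targetLocked(30.0, 40.0) ? "yes" : "no");
    return 0;
}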



Based on the error analysis and the EO video images obtained from the flight test, we can conclude that the specified target was always centered in the field of view of the camera when the target was locked. The results obtained from the flight test are significant and promising because the target was well within the maximum possible error range and the camera was always oriented in the specified direction (North).

V. CONCLUSION

A novel nonlinear, closed-form analytical expression for the three-axis gimbal system was derived to calculate the exact pointing angles with a specified orientation based on the GPS and IMU information of the aircraft. The orientation of the camera can be specified in any direction (North, East, West, or South). The autonomous target tracking (ATT) algorithm was successfully implemented in a payload called SUNDOG (Surveillance by University of North Dakota Observational Gimbal) installed in a small UAS. ATT was tested through several experiments, including a laboratory test, an MGV test, and actual flight tests. An average accuracy of 95%, with an average target offset of 22 meters and a standard deviation of 8 meters, was obtained in the flight experiments. The results obtained from the experiments are significant and promising, since the target was always in the field of view of the camera and within the maximum possible error range, which includes the IMU, GPS, and gimbal manual setup errors. Also, the camera was always maintained in the specified orientation (North). Without any human intervention, the ATT algorithm was demonstrated successfully in real time using a custom-developed C++ program during a UAS flight.

Fig. 17. Target Locked and Out of Range when Alpha = ±25 deg & Beta = ±75 deg

ACKNOWLEDGMENTS

University of North Dakota Department of Mechanical Engineering and the UASE Laboratory; North Dakota Department of Commerce; United States Air Force UAV Battle Lab, contract number FA4861-06-C-C006; NASA Grant NNG05WC01A; Intercollegiate Academic Funding (UND); Research Development and Compliance (RD&C).

Fig. 18. Target Locked and Out of Range when Alpha = ±80 deg & Beta = ±75 deg





