Koozali.org: home of the SME Server
Obsolete Releases => SME Server 7.x => Topic started by: lucho115 on July 12, 2006, 04:30:27 PM
-
I have a new install of SME 7.0 final. I am trying it with only one user and I really like it. I left my server up all night, and this morning I ran "top" to check the stats and saw that system CPU usage was at 74%, even though no users were on the server and ClamAV is disabled. I don't know what happened, but I could see that a "runsv" process was eating 15% of my CPU and a "runsvdir" process another 5%.
Yesterday, when I went to bed, system CPU usage was 0.3%, and now it is 74%. Can somebody help me solve this issue?
Why are these two processes, "runsv" and "runsvdir", eating 20% of my CPU when yesterday I could only see runsv sleeping?
I have a P4 HT 3.0 GHz with 2 MB cache and 2 GB of DDR2 RAM.
This is the picture now:
top - 11:17:52 up 19:22, 3 users, load average: 2.86, 2.80, 2.74
Tasks: 171 total, 4 running, 167 sleeping, 0 stopped, 0 zombie
Cpu(s): 27.2% us, 72.8% sy, 0.0% ni, 0.0% id, 0.0% wa, 0.0% hi, 0.0% si
Mem: 2072844k total, 408804k used, 1664040k free, 52152k buffers
Swap: 2031608k total, 0k used, 2031608k free, 195108k cached
PID USER PR NI VIRT RES SHR S %CPU %MEM TIME+ COMMAND
1893 root 25 0 1692 224 180 R 13.7 0.0 170:37.51 runsv
1701 root 15 0 2932 304 244 S 5.9 0.0 50:54.97 runsvdir
1 root 16 0 2640 628 544 S 0.0 0.0 0:00.46 init
2 root RT 0 0 0 0 S 0.0 0.0 0:00.00 migration/0
3 root 34 19 0 0 0 R 0.0 0.0 0:06.11 ksoftirqd/0
4 root 5 -10 0 0 0 S 0.0 0.0 0:00.00 events/0
5 root 5 -10 0 0 0 S 0.0 0.0 0:00.00 khelper
6 root 5 -10 0 0 0 S 0.0 0.0 0:00.00 kblockd/0
7 root 15 0 0 0 0 S 0.0 0.0 0:00.00 khubd
38 root 20 0 0 0 0 S 0.0 0.0 0:00.00 pdflush
39 root 15 0 0 0 0 S 0.0 0.0 0:00.22 pdflush
41 root 12 -10 0 0 0 S 0.0 0.0 0:00.00 aio/0
36 root 15 0 0 0 0 S 0.0 0.0 1:09.68 kapmd
40 root 25 0 0 0 0 S 0.0 0.0 0:00.00 kswapd0
115 root 25 0 0 0 0 S 0.0 0.0 0:00.00 kseriod
185 root 5 -10 0 0 0 S 0.0 0.0 0:00.00 ata/0
188 root 23 0 0 0 0 S 0.0 0.0 0:00.00 scsi_eh_0
189 root 23 0 0 0 0 S 0.0 0.0 0:00.00 scsi_eh_1
213 root 11 -10 0 0 0 S 0.0 0.0 0:00.00 kmirrord
214 root 11 -10 0 0 0 S 0.0 0.0 0:00.00 kmir_mon
222 root 15 0 0 0 0 S 0.0 0.0 0:00.65 md2_raid1
223 root 15 0 0 0 0 S 0.0 0.0 0:00.00 md1_raid1
228 root 15 0 0 0 0 S 0.0 0.0 0:09.36 kjournald
1109 root 6 -10 3368 444 360 S 0.0 0.0 0:00.00 udevd
1415 root 6 -10 0 0 0 S 0.0 0.0 0:00.00 kauditd
1464 root 16 0 0 0 0 S 0.0 0.0 0:00.00 kjournald
1674 root 19 0 2232 404 348 S 0.0 0.0 0:00.00 mingetty
1679 root 16 0 1604 400 348 S 0.0 0.0 0:00.00 mingetty
1838 root 16 0 2980 224 180 S 0.0 0.0 0:00.00 runsv
1839 root 16 0 1356 224 180 S 0.0 0.0 0:00.00 runsv
1840 root 16 0 2468 220 180 S 0.0 0.0 0:00.00 runsv
1841 root 16 0 1332 224 180 S 0.0 0.0 0:00.00 runsv
1842 root 16 0 2844 224 180 S 0.0 0.0 0:00.00 runsv
1843 root 16 0 2652 224 180 S 0.0 0.0 0:00.00 runsv
1844 root 16 0 2180 224 180 S 0.0 0.0 0:00.00 runsv
1845 root 16 0 2340 224 180 S 0.0 0.0 0:00.00 runsv
1846 root 16 0 2088 220 180 S 0.0 0.0 0:00.00 runsv
1847 root 16 0 1796 224 180 S 0.0 0.0 0:00.00 runsv
1848 root 16 0 2228 224 180 S 0.0 0.0 0:00.00 runsv
1849 root 16 0 3316 224 180 S 0.0 0.0 0:00.00 runsv
1850 root 16 0 2108 224 180 S 0.0 0.0 0:00.00 runsv
1851 root 16 0 3204 224 180 S 0.0 0.0 0:00.00 runsv
1852 root 16 0 2140 224 180 S 0.0 0.0 0:00.00 runsv
1853 root 16 0 1364 224 180 S 0.0 0.0 0:00.00 runsv
1854 root 16 0 2756 224 180 S 0.0 0.0 0:00.00 runsv
1855 root 16 0 1544 220 180 S 0.0 0.0 0:00.00 runsv
1856 root 16 0 2588 224 180 S 0.0 0.0 0:00.00 runsv
1857 root 16 0 3188 224 180 S 0.0 0.0 0:00.00 runsv
1858 root 16 0 2432 220 180 S 0.0 0.0 0:00.00 runsv
1859 root 16 0 1448 220 180 S 0.0 0.0 0:00.00 runsv
1860 root 16 0 1824 220 180 S 0.0 0.0 0:00.00 runsv
1861 root 16 0 2724 224 180 S 0.0 0.0 0:00.00 runsv
1862 root 16 0 3332 224 180 S 0.0 0.0 0:00.00 runsv
1863 root 16 0 3252 224 180 S 0.0 0.0 0:00.00 runsv
1864 root 16 0 1828 224 180 S 0.0 0.0 0:00.00 runsv
1865 root 16 0 1392 220 180 S 0.0 0.0 0:00.00 runsv
1866 root 16 0 2016 220 180 S 0.0 0.0 0:00.00 runsv
1867 smelog 17 0 1320 268 224 S 0.0 0.0 0:00.00 multilog
1868 smelog 18 0 1320 264 220 S 0.0 0.0 0:00.00 multilog
1869 qmaill 18 0 1320 264 220 S 0.0 0.0 0:00.00 multilog
1870 root 16 0 2084 224 180 S 0.0 0.0 0:00.00 runsv
1871 root 16 0 2084 224 180 S 0.0 0.0 0:00.00 runsv
1872 root 16 0 2008 220 180 S 0.0 0.0 0:00.00 runsv
1873 smelog 18 0 1320 264 220 S 0.0 0.0 0:00.00 multilog
1874 smelog 16 0 1320 268 224 S 0.0 0.0 0:00.00 multilog
1875 smelog 18 0 1320 264 220 S 0.0 0.0 0:00.00 multilog
1876 smelog 15 0 1320 268 224 S 0.0 0.0 0:00.00 multilog
1877 smelog 15 0 1320 268 224 S 0.0 0.0 0:00.00 multilog
1878 nobody 16 0 9708 6476 1912 S 0.0 0.3 0:00.21 smtp-auth-proxy
1879 smelog 15 0 1320 268 224 S 0.0 0.0 0:00.00 multilog
1880 smelog 17 0 1320 268 224 S 0.0 0.0 0:00.00 multilog
1881 root 16 0 1320 268 224 S 0.0 0.0 0:00.00 multilog
1882 smelog 16 0 1320 268 224 S 0.0 0.0 0:00.00 multilog
1883 smelog 15 0 1320 268 224 S 0.0 0.0 0:00.00 multilog
1884 smelog 16 0 1320 268 224 S 0.0 0.0 0:00.00 multilog
1885 smelog 18 0 1320 264 220 S 0.0 0.0 0:00.00 multilog
1886 smelog 16 0 1320 268 224 S 0.0 0.0 0:00.01 multilog
1887 root 16 0 3004 524 432 S 0.0 0.0 0:00.01 ulogd
1888 imaplog 17 0 1320 268 224 S 0.0 0.0 0:00.00 multilog
1889 imaplog 17 0 1320 268 224 S 0.0 0.0 0:00.00 multilog
1890 smelog 18 0 1320 264 220 S 0.0 0.0 0:00.00 multilog
1891 cvmlog 16 0 1320 268 224 S 0.0 0.0 0:00.00 multilog
1892 root 18 0 1916 236 180 S 0.0 0.0 0:00.00 cvm-unix
1894 root 16 0 2792 220 180 S 0.0 0.0 0:00.00 runsv
1895 root 16 0 2872 220 180 S 0.0 0.0 0:00.00 runsv
1896 root 16 0 2980 224 180 S 0.0 0.0 0:00.00 runsv
1897 smelog 18 0 1320 264 220 S 0.0 0.0 0:00.00 multilog
1898 smelog 17 0 1320 268 224 S 0.0 0.0 0:00.00 multilog
1899 dnslog 15 0 1320 268 224 S 0.0 0.0 0:00.00 multilog
1900 dnscache 16 0 2740 1372 304 S 0.0 0.1 0:00.02 dnscache
1901 smelog 15 0 1320 268 224 S 0.0 0.0 0:00.00 multilog
1902 dnslog 16 0 1320 268 224 S 0.0 0.0 0:00.00 multilog
1903 smelog 16 0 1320 268 224 S 0.0 0.0 0:00.00 multilog
1904 qmaill 15 0 1320 268 224 S 0.0 0.0 0:00.00 multilog
1905 smelog 16 0 1320 268 224 S 0.0 0.0 0:00.00 multilog
1906 smelog 17 0 1320 268 224 S 0.0 0.0 0:00.00 multilog
1907 smelog 17 0 1320 268 224 S 0.0 0.0 0:00.00 multilog
1908 smelog 16 0 1320 268 224 S 0.0 0.0 0:00.00 multilog
1909 dnslog 15 0 1320 268 224 S 0.0 0.0 0:00.00 multilog
2346 root 16 0 36872 31m 2324 S 0.0 1.6 0:01.57 spamd
2434 root 16 0 1912 640 536 S 0.0 0.0 0:00.04 syslogd
2438 root 16 0 3296 1384 464 S 0.0 0.1 0:00.04 klogd
2483 root 16 0 3020 472 392 S 0.0 0.0 0:00.00 mdadm
2518 root 16 0 1716 484 428 S 0.0 0.0 0:00.00 apmd
2720 spamd 16 0 36872 31m 2332 S 0.0 1.6 0:00.00 spamd
2723 spamd 16 0 36872 31m 2332 S 0.0 1.6 0:00.00 spamd
2993 root 16 0 3364 668 576 S 0.0 0.0 0:00.00 bwbar
5472 root 16 0 6312 1108 704 S 0.0 0.1 0:00.00 crond
5639 dnscache 16 0 1956 580 304 S 0.0 0.0 0:00.15 dnscache
5690 root 16 0 56 44 28 S 0.0 0.0 0:00.00 tcpsvd
5711 root 18 0 56 44 28 S 0.0 0.0 0:00.00 tcpsvd
5781 root 15 0 52 40 28 S 0.0 0.0 0:00.00 tcpsvd
5795 root 16 0 52 40 28 S 0.0 0.0 0:00.00 tcpsvd
5821 dns 16 0 1428 296 236 S 0.0 0.0 0:00.00 tinydns
5950 lp 18 0 5660 1216 1048 S 0.0 0.1 0:00.00 lpd
5982 root 16 0 3044 1364 832 S 0.0 0.1 0:00.00 dhcpd
5998 lp 16 0 5660 1224 1056 S 0.0 0.1 0:00.00 lpd
6035 clamav 25 0 27384 13m 1200 S 0.0 0.7 0:02.32 clamd
6055 clamav 15 0 5400 1380 1168 S 0.0 0.1 0:00.01 freshclam
6105 ldap 17 0 13300 3264 2220 S 0.0 0.2 0:00.03 slapd
6139 root 16 0 4388 4388 3428 S 0.0 0.2 0:00.02 ntpd
6168 qmails 16 0 3240 396 300 S 0.0 0.0 0:00.01 qmail-send
6176 root 15 0 1744 348 268 S 0.0 0.0 0:00.00 qmail-lspawn
6177 qmailr 18 0 1888 336 256 S 0.0 0.0 0:00.00 qmail-rspawn
6178 qmailq 15 0 2840 348 268 S 0.0 0.0 0:00.00 qmail-clean
6219 root 15 0 52 40 28 S 0.0 0.0 0:00.00 tcpsvd
6231 qpsmtpd 16 0 12292 10m 1960 S 0.0 0.5 0:00.40 qpsmtpd-forkser
6250 root 18 0 52 40 28 S 0.0 0.0 0:00.00 tcpsvd
6298 root 16 0 5068 1664 1364 S 0.0 0.1 0:00.00 sshd
6317 root 16 0 7736 3216 2344 S 0.0 0.2 0:00.32 httpd-admin
6340 root 16 0 37760 9508 6420 S 0.0 0.5 0:00.70 httpd
6398 root 17 0 57676 3372 1748 S 0.0 0.2 0:00.09 radiusd
6410 admin 16 0 7904 3424 2464 S 0.0 0.2 0:00.02 httpd-admin
6471 root 16 0 2900 652 556 S 0.0 0.0 0:00.00 atalkd
6490 root 16 0 6156 1908 1460 S 0.0 0.1 0:00.07 nmbd
6496 root 18 0 9676 2720 2092 S 0.0 0.1 0:00.07 smbd
6569 www 16 0 38208 10m 6924 S 0.0 0.5 0:00.05 httpd
6570 www 16 0 38056 9.8m 6664 S 0.0 0.5 0:00.01 httpd
6571 www 16 0 42800 14m 9352 S 0.0 0.7 0:00.20 httpd
6572 www 16 0 40756 12m 8640 S 0.0 0.6 0:00.08 httpd
6604 www 16 0 38252 9m 6644 S 0.0 0.5 0:00.03 httpd
6605 www 16 0 42460 14m 9160 S 0.0 0.7 0:00.18 httpd
6606 www 16 0 37992 9880 6640 S 0.0 0.5 0:00.00 httpd
6607 www 16 0 38248 9.9m 6644 S 0.0 0.5 0:00.03 httpd
6608 www 16 0 38008 9924 6640 S 0.0 0.5 0:00.00 httpd
6610 www 16 0 37992 9872 6640 S 0.0 0.5 0:00.00 httpd
6634 dbus 15 0 3224 1204 1072 S 0.0 0.1 0:00.03 dbus-daemon-1
6647 root 18 0 9676 2712 2084 S 0.0 0.1 0:00.00 smbd
6692 root 16 0 8608 5484 1604 S 0.0 0.3 0:01.99 hald
6848 mysql 15 0 36204 8460 2988 S 0.0 0.4 0:00.03 mysqld
7474 root 17 0 3036 1248 976 S 0.0 0.1 0:00.00 login
17469 root 19 0 2368 568 476 S 0.0 0.0 0:00.00 papd
17473 root 16 0 4424 692 584 S 0.0 0.0 0:00.00 cnid_metad
17489 root 17 0 8168 1572 1284 S 0.0 0.1 0:00.00 afpd
17942 admin 16 0 7904 3420 2464 S 0.0 0.2 0:00.01 httpd-admin
18650 admin 16 0 7904 3420 2464 S 0.0 0.2 0:00.00 httpd-admin
22108 root 15 0 5576 1408 1156 S 0.0 0.1 0:00.00 bash
26009 root 16 0 2232 984 740 S 0.0 0.0 0:13.82 top
22448 www 16 0 40756 12m 8640 S 0.0 0.6 0:00.08 httpd
31147 root 15 0 8728 1936 1560 S 0.0 0.1 0:00.77 sshd
987 root 15 0 5196 1420 1168 S 0.0 0.1 0:00.07 bash
11159 root 15 0 6804 1940 1560 S 0.0 0.1 0:00.81 sshd
12533 root 15 0 6032 1416 1164 S 0.0 0.1 0:00.04 bash
31013 root 17 0 5200 712 452 S 0.0 0.0 0:00.00 man
31020 root 17 0 5532 1008 880 S 0.0 0.0 0:00.00 sh
31021 root 23 0 5532 1048 920 S 0.0 0.1 0:00.00 sh
31025 root 15 0 6088 1152 1020 S 0.0 0.1 0:00.00 nroff
31026 root 15 0 6132 660 556 S 0.0 0.0 0:00.00 less
31040 root 16 0 4060 712 528 S 0.0 0.0 0:00.00 iconv
14636 root 16 0 2036 876 648 R 0.0 0.0 0:00.00 top
14776 root 25 0 1692 224 180 R 0.0 0.0 0:00.00 runsv
Can somebody help me?
Sorry about my English, I am from Argentina.
Bye
-
Hi Lucho
What is the output from the top command when the CPU is at 74%?
We are finding clamd taking up a lot of CPU when scanning the drive.
-
lucho115
> I don't know what happened, but I could see that a "runsv" process was
> eating 15% of my CPU and a "runsvdir" process another 5%.
> Why are these two processes, "runsv" and "runsvdir", eating 20% of my CPU
> when yesterday I could only see runsv sleeping?
http://smarden.org/runit/runsv.8.html
http://smarden.org/runit/runsvdir.8.html
Try
top -i
to show only the active processes, and also try
htop
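To see which service the busy runsv instance belongs to, a quick check like this can help (a rough sketch, assuming standard procps ps and the usual /service layout on SME 7; 1893 is just the PID from your top output):
# runsv is started with the service directory name as its argument, so the args column shows which service it supervises
ps -o pid,args -p 1893
# show the child process (the actual daemon) this runsv instance is supervising, if any
ps -o pid,args --ppid 1893
# list all supervised service directories
ls -l /service/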
Here's a similar but older beta-release issue:
http://forums.contribs.org/index.php?topic=28623.0
Are you sure you have the final release CD?
Have you installed any contribs?
-
RayMitchell
> Are you sure you have the final release CD?
> Have you installed any contribs?
Yes, it is the final release, and yes, I installed the sme7admin contrib to monitor the system. I then did the same on another fresh SME 7 final install, and again the CPU was at 74% after two days. My solution is not to install any monitoring software, because I tried another one and had the same problem. Nothing is reported in the contrib bug tracker, and a lot of people are using these contribs.
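For anyone checking the same thing, a couple of standard rpm queries can show which contribs are installed (the sme7admin name here is just the package mentioned above):
# list recently installed packages, newest first, to spot contribs added after the base install
rpm -qa --last | head -20
# or look for a specific contrib by name
rpm -qa | grep -i sme7admin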
-
I've got the same issue here. For a few weeks now, runsv and runsvdir have been eating about 25% of my CPU time, even when the server is not in use.
SME 7.1.3
P4 2.8GHz
1GB
Since around the same time (I guess), I have been receiving mails from my server saying "A DegradedArray event has been detected on md device /dev/md2."
Maybe there's a connection? :roll:
-
Hi,
Today I updated my server with yum to 7.2 and guess what? The high CPU time of the runsv and runsvdir processes is gone. Everything looks normal again in top.
:D :D
Love this server.
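For anyone following along, a rough sketch of the update sequence from a root shell (the post-upgrade and reboot events are, as far as I recall, the usual SME 7 convention after a yum update):
# pull the updates from the SME repositories
yum update
# then let the templates regenerate and reboot cleanly
signal-event post-upgrade
signal-event reboot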
-
123chris
> "A DegradedArray event has been detected on md device /dev/md2."
Did you do anything about your degraded array?
What does this show?
cat /proc/mdstat
-
RayMitchell,
No, I couldn't solve that problem. :( I searched this and other forums, but no topic seems to have the solution for this problem (in a single-disk config).
I also contacted a poster on this forum whose topic title said [Solved]. I asked him for the solution, and he replied that I should turn off the RaidMonitor. Maybe that helps to get rid of the recurring e-mails, but I don't think it solves the problem.
Here's the output you requested:
[root@sme ~]# cat /proc/mdstat
Personalities : [raid1]
md2 : active raid1 sda2[0]
78043648 blocks [2/1] [U_]
md1 : active raid1 sda1[0]
104320 blocks [2/1] [U_]
unused devices: <none>
Do you have a solution for this?
-
123chris
>....in a single disk config
If you only have one hard disk, then that error message is expected. Note the system will happily function in degraded mode (i.e. with one disk only).
>...I should turn off the RaidMonitor
As you deliberately have only one disk, there is no actual problem. Yes, disabling the RaidMonitor is a satisfactory answer.
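For reference, a couple of standard mdadm commands can confirm what the array is doing (device names here just follow the /proc/mdstat output you posted):
# show detailed state of the array; with a single disk it will report one removed member, which is expected in your setup
mdadm --detail /dev/md2
# if a second disk were ever added, adding matching partitions would start a rebuild, e.g.:
#   mdadm --add /dev/md1 /dev/sdb1
#   mdadm --add /dev/md2 /dev/sdb2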
-
Ray,
Thanks for your answer. I guess I'll do that.
I still think it's strange that I didn't receive these mails in the beginning. Something must have changed since then. :wink:
Thanks.